Mar 13 10:33:17.192691 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 13 10:33:17.899845 master-0 kubenswrapper[3972]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 10:33:17.899845 master-0 kubenswrapper[3972]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 13 10:33:17.899845 master-0 kubenswrapper[3972]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 10:33:17.899845 master-0 kubenswrapper[3972]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 10:33:17.899845 master-0 kubenswrapper[3972]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 13 10:33:17.899845 master-0 kubenswrapper[3972]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 13 10:33:17.902517 master-0 kubenswrapper[3972]: I0313 10:33:17.902241 3972 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 13 10:33:17.908415 master-0 kubenswrapper[3972]: W0313 10:33:17.908356 3972 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:33:17.908415 master-0 kubenswrapper[3972]: W0313 10:33:17.908392 3972 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908440 3972 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908454 3972 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908465 3972 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908474 3972 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908482 3972 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908491 3972 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908500 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908509 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908517 3972 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908526 3972 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908535 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908544 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908553 3972 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908561 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908570 3972 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908578 3972 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908587 3972 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908595 3972 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:33:17.908604 master-0 kubenswrapper[3972]: W0313 10:33:17.908603 3972 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908613 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908621 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908631 3972 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908640 3972 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908648 3972 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908658 3972 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908666 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908675 3972 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908683 3972 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908692 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908700 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908709 3972 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908718 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908726 3972 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908735 3972 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908743 3972 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908752 3972 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908760 3972 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908768 3972 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:33:17.909656 master-0 kubenswrapper[3972]: W0313 10:33:17.908777 3972 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908788 3972 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908798 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908811 3972 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908822 3972 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908832 3972 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908841 3972 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908850 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908859 3972 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908868 3972 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908876 3972 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908884 3972 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908892 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908901 3972 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.908997 3972 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.909051 3972 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.909062 3972 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.909072 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.909139 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:33:17.910709 master-0 kubenswrapper[3972]: W0313 10:33:17.909150 3972 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909159 3972 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909168 3972 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909177 3972 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909223 3972 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909233 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909244 3972 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909253 3972 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909261 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909270 3972 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909279 3972 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909288 3972 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: W0313 10:33:17.909297 3972 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909562 3972 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909582 3972 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909597 3972 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909609 3972 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909621 3972 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909631 3972 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909644 3972 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909655 3972 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909666 3972 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 10:33:17.911731 master-0 kubenswrapper[3972]: I0313 10:33:17.909676 3972 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909687 3972 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909698 3972 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909708 3972 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909718 3972 flags.go:64] FLAG: --cgroup-root=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909750 3972 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909760 3972 flags.go:64] FLAG: --client-ca-file=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909779 3972 flags.go:64] FLAG: --cloud-config=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909789 3972 flags.go:64] FLAG: --cloud-provider=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909798 3972 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909812 3972 flags.go:64] FLAG: --cluster-domain=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909822 3972 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909832 3972 flags.go:64] FLAG: --config-dir=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909842 3972 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909852 3972 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909865 3972 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909875 3972 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909885 3972 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909895 3972 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909905 3972 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909916 3972 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909926 3972 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909937 3972 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909946 3972 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909958 3972 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 10:33:17.912896 master-0 kubenswrapper[3972]: I0313 10:33:17.909968 3972 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.909978 3972 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.909987 3972 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.909997 3972 flags.go:64] FLAG: --enable-server="true"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910007 3972 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910019 3972 flags.go:64] FLAG: --event-burst="100"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910030 3972 flags.go:64] FLAG: --event-qps="50"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910039 3972 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910050 3972 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910060 3972 flags.go:64] FLAG: --eviction-hard=""
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910071 3972 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910081 3972 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910138 3972 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910149 3972 flags.go:64] FLAG: --eviction-soft=""
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910168 3972 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910178 3972 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910187 3972 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910197 3972 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910207 3972 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910217 3972 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910227 3972 flags.go:64] FLAG: --feature-gates=""
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910239 3972 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910249 3972 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910259 3972 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910269 3972 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910279 3972 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 10:33:17.914235 master-0 kubenswrapper[3972]: I0313 10:33:17.910289 3972 flags.go:64] FLAG: --help="false"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910299 3972 flags.go:64] FLAG: --hostname-override=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910309 3972 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910319 3972 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910328 3972 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910358 3972 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910369 3972 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910379 3972 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910388 3972 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910397 3972 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910408 3972 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910418 3972 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910428 3972 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910438 3972 flags.go:64] FLAG: --kube-reserved=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910448 3972 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910457 3972 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910467 3972 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910477 3972 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910502 3972 flags.go:64] FLAG: --lock-file=""
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910520 3972 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910530 3972 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910540 3972 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910574 3972 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910584 3972 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910593 3972 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 13 10:33:17.915607 master-0 kubenswrapper[3972]: I0313 10:33:17.910603 3972 flags.go:64] FLAG: --logging-format="text"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910613 3972 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910624 3972 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910634 3972 flags.go:64] FLAG: --manifest-url=""
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910643 3972 flags.go:64] FLAG: --manifest-url-header=""
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910656 3972 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910666 3972 flags.go:64] FLAG: --max-open-files="1000000"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910678 3972 flags.go:64] FLAG: --max-pods="110"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910688 3972 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910698 3972 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910708 3972 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910717 3972 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910728 3972 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910738 3972 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910748 3972 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910769 3972 flags.go:64] FLAG: --node-status-max-images="50"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910780 3972 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910790 3972 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910801 3972 flags.go:64] FLAG: --pod-cidr=""
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910810 3972 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910824 3972 flags.go:64] FLAG: --pod-manifest-path=""
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910834 3972 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910845 3972 flags.go:64] FLAG: --pods-per-core="0"
Mar 13 10:33:17.916851 master-0 kubenswrapper[3972]: I0313 10:33:17.910855 3972 flags.go:64] FLAG: --port="10250"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910865 3972 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910888 3972 flags.go:64] FLAG: --provider-id=""
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910919 3972 flags.go:64] FLAG: --qos-reserved=""
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910930 3972 flags.go:64] FLAG: --read-only-port="10255"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910940 3972 flags.go:64] FLAG: --register-node="true"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910950 3972 flags.go:64] FLAG: --register-schedulable="true"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910960 3972 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.910975 3972 flags.go:64] FLAG: --registry-burst="10"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911005 3972 flags.go:64] FLAG: --registry-qps="5"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911015 3972 flags.go:64] FLAG: --reserved-cpus=""
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911025 3972 flags.go:64] FLAG: --reserved-memory=""
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911036 3972 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911046 3972 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911056 3972 flags.go:64] FLAG: --rotate-certificates="false"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911066 3972 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911075 3972 flags.go:64] FLAG: --runonce="false"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911085 3972 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911121 3972 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911132 3972 flags.go:64] FLAG: --seccomp-default="false"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911141 3972 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911152 3972 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911162 3972 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911172 3972 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911182 3972 flags.go:64] FLAG: --storage-driver-password="root"
Mar 13 10:33:17.918139 master-0 kubenswrapper[3972]: I0313 10:33:17.911191 3972 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911203 3972 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911213 3972 flags.go:64] FLAG: --storage-driver-user="root"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911224 3972 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911236 3972 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911246 3972 flags.go:64] FLAG: --system-cgroups=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911258 3972 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911274 3972 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911283 3972 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911309 3972 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911325 3972 flags.go:64] FLAG: --tls-min-version=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911335 3972 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911344 3972 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911354 3972 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911364 3972 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911373 3972 flags.go:64] FLAG: --v="2"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911386 3972 flags.go:64] FLAG: --version="false"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911398 3972 flags.go:64] FLAG: --vmodule=""
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911409 3972 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: I0313 10:33:17.911420 3972 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: W0313 10:33:17.911638 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: W0313 10:33:17.911649 3972 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: W0313 10:33:17.911659 3972 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: W0313 10:33:17.911668 3972 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:33:17.919514 master-0 kubenswrapper[3972]: W0313 10:33:17.911677 3972 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911689 3972 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911702 3972 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911714 3972 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911725 3972 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911735 3972 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911745 3972 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911756 3972 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911765 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911775 3972 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911784 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911792 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911801 3972 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911813 3972 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911823 3972 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911832 3972 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911843 3972 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911858 3972 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911902 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:33:17.920765 master-0 kubenswrapper[3972]: W0313 10:33:17.911911 3972 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911920 3972 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911928 3972 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911937 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911945 3972 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911954 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911963 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911971 3972 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911980 3972 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911988 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.911997 3972 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912006 3972 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912014 3972 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912022 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912031 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912040 3972 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912049 3972 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912057 3972 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912065 3972 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912074 3972 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:33:17.921834 master-0 kubenswrapper[3972]: W0313 10:33:17.912082 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912115 3972 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912124 3972 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912132 3972 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912141 3972 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912149 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912158 3972 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912167 3972 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912178 3972 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912188 3972 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912199 3972 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912243 3972 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912252 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912261 3972 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912270 3972 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912278 3972 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912287 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912296 3972 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912304 3972 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912312 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:33:17.922880 master-0 kubenswrapper[3972]: W0313 10:33:17.912321 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912329 3972 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912338 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912346 3972 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912355 3972 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912364 3972 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912373 3972 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912381 3972 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: W0313 10:33:17.912390 3972 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:33:17.924219 master-0 kubenswrapper[3972]: I0313 10:33:17.912421 3972 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:33:17.926178 master-0 kubenswrapper[3972]: I0313 10:33:17.926050 3972 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 13 10:33:17.926178 master-0 kubenswrapper[3972]: I0313 10:33:17.926146 3972 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 13 10:33:17.926333 master-0 kubenswrapper[3972]: W0313 10:33:17.926292 3972 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:33:17.926333 master-0 kubenswrapper[3972]: W0313 10:33:17.926310 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:33:17.926333 master-0 kubenswrapper[3972]: W0313 10:33:17.926322 3972 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:33:17.926333 master-0 kubenswrapper[3972]: W0313 10:33:17.926334 3972 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926343 3972 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926352 3972 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926360 3972 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926368 3972 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926377 3972 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926385 3972 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926393 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926401 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926410 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926419 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926427 3972 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926434 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926442 3972 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926450 3972 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926458 3972 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926466 3972 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926474 3972 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926482 3972 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926489 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926497 3972 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:33:17.926578 master-0 kubenswrapper[3972]: W0313 10:33:17.926524 3972 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926541 3972 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926551 3972 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926563 3972 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926573 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926582 3972 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926590 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926599 3972 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926608 3972 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926616 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926625 3972 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926633 3972 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926641 3972 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926649 3972 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926657 3972 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926665 3972 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926672 3972 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926680 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926688 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926696 3972 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:33:17.927831 master-0 kubenswrapper[3972]: W0313 10:33:17.926703 3972 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926712 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926719 3972 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926727 3972 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926735 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926746 3972 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926758 3972 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926768 3972 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926776 3972 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926784 3972 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926792 3972 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926800 3972 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926808 3972 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926816 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926838 3972 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926846 3972 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926862 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926870 3972 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926880 3972 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:33:17.929377 master-0 kubenswrapper[3972]: W0313 10:33:17.926888 3972 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926895 3972 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926903 3972 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926914 3972 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926924 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926934 3972 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926945 3972 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926955 3972 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.926964 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: I0313 10:33:17.926977 3972 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.927246 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.927260 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.927270 3972 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.927279 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:33:17.930519 master-0 kubenswrapper[3972]: W0313 10:33:17.927286 3972 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927294 3972 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927302 3972 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927310 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927319 3972 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927327 3972 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927335 3972 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927343 3972 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927354 3972 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927365 3972 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927373 3972 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927381 3972 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927389 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927409 3972 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927426 3972 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927434 3972 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927442 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927450 3972 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927457 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:33:17.931723 master-0 kubenswrapper[3972]: W0313 10:33:17.927466 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927473 3972 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927484 3972 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927492 3972 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927500 3972 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927508 3972 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927516 3972 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927525 3972 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927534 3972 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927543 3972 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927550 3972 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927559 3972 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927567 3972 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927575 3972 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927583 3972 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927590 3972 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927598 3972 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927606 3972 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927614 3972 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927621 3972 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:33:17.934557 master-0 kubenswrapper[3972]: W0313 10:33:17.927629 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927637 3972 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927644 3972 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927652 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927660 3972 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927667 3972 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927677 3972 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927686 3972 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927694 3972 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927701 3972 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927709 3972 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927717 3972 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927725 3972 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927732 3972 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927740 3972 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927748 3972 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927755 3972 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927763 3972 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927772 3972 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927780 3972 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927788 3972 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:33:17.935957 master-0 kubenswrapper[3972]: W0313 10:33:17.927796 3972 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927804 3972 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927814 3972 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927824 3972 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927833 3972 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927841 3972 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927850 3972 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: W0313 10:33:17.927861 3972 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: I0313 10:33:17.927875 3972 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:33:17.937701 master-0 kubenswrapper[3972]: I0313 10:33:17.928232 3972 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 13 10:33:17.939423 master-0 kubenswrapper[3972]: I0313 10:33:17.939371 3972 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Mar 13 10:33:17.941312 master-0 kubenswrapper[3972]: I0313 10:33:17.941265 3972 server.go:997] "Starting client certificate rotation"
Mar 13 10:33:17.941504 master-0 kubenswrapper[3972]: I0313 10:33:17.941386 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 13 10:33:17.941732 master-0 kubenswrapper[3972]: I0313 10:33:17.941677 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 10:33:17.970658 master-0 kubenswrapper[3972]: I0313 10:33:17.970521 3972 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 10:33:17.973943 master-0 kubenswrapper[3972]: I0313 10:33:17.973844 3972 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 10:33:17.983415 master-0 kubenswrapper[3972]: E0313 10:33:17.983298 3972 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:18.001378 master-0 kubenswrapper[3972]: I0313 10:33:18.001298 3972 log.go:25] "Validated CRI v1 runtime API"
Mar 13 10:33:18.010869 master-0 kubenswrapper[3972]: I0313 10:33:18.010780 3972 log.go:25] "Validated CRI v1 image API"
Mar 13 10:33:18.013897 master-0 kubenswrapper[3972]: I0313 10:33:18.013843 3972 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 13 10:33:18.019018 master-0 kubenswrapper[3972]: I0313 10:33:18.018952 3972 fs.go:135] Filesystem UUIDs: map[58e57e2d-ae5b-4324-bfe8-6d8d8bd04e58:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 13 10:33:18.019018 master-0 kubenswrapper[3972]: I0313 10:33:18.018997 3972 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}]
Mar 13 10:33:18.050227 master-0 kubenswrapper[3972]: I0313 10:33:18.049656 3972 manager.go:217] Machine: {Timestamp:2026-03-13 10:33:18.044149801 +0000 UTC m=+0.662266229 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:3a49dbefec214e87acca6e8120215b7b SystemUUID:3a49dbef-ec21-4e87-acca-6e8120215b7b BootID:794a19f0-76ba-45e8-ae39-0211fb872ab6 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:35:d5:aa Speed:-1 Mtu:9000} {Name:ovs-system MacAddress:96:ef:67:d7:01:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 13 10:33:18.050227 master-0 kubenswrapper[3972]: I0313 10:33:18.050186 3972 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 13 10:33:18.050562 master-0 kubenswrapper[3972]: I0313 10:33:18.050469 3972 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 13 10:33:18.052883 master-0 kubenswrapper[3972]: I0313 10:33:18.052811 3972 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 13 10:33:18.053357 master-0 kubenswrapper[3972]: I0313 10:33:18.053288 3972 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 13 10:33:18.053819 master-0 kubenswrapper[3972]: I0313 10:33:18.053353 3972 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 13 10:33:18.053916 master-0 kubenswrapper[3972]: I0313 10:33:18.053876 3972 topology_manager.go:138] "Creating topology manager with none policy"
Mar 13 10:33:18.053916 master-0 kubenswrapper[3972]: I0313 10:33:18.053898 3972 container_manager_linux.go:303] "Creating device plugin manager"
Mar 13 10:33:18.054026 master-0 kubenswrapper[3972]: I0313 10:33:18.053930 3972 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 13 10:33:18.054026 master-0 kubenswrapper[3972]: I0313 10:33:18.053999 3972 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 13 10:33:18.054540 master-0 kubenswrapper[3972]: I0313 10:33:18.054492 3972 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 10:33:18.054716 master-0 kubenswrapper[3972]: I0313 10:33:18.054678 3972 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 13 10:33:18.059190 master-0 kubenswrapper[3972]: I0313 10:33:18.059144 3972 kubelet.go:418] "Attempting to sync node with API server"
Mar 13 10:33:18.059190 master-0 kubenswrapper[3972]: I0313 10:33:18.059190 3972 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 13 10:33:18.059329 master-0 kubenswrapper[3972]: I0313 10:33:18.059270 3972 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 13 10:33:18.059329 master-0 kubenswrapper[3972]: I0313 10:33:18.059299 3972 kubelet.go:324] "Adding apiserver pod source"
Mar 13 10:33:18.059424 master-0 kubenswrapper[3972]: I0313 10:33:18.059356 3972 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 13 10:33:18.067785 master-0 kubenswrapper[3972]: I0313 10:33:18.067728 3972 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 13 10:33:18.069578 master-0 kubenswrapper[3972]: W0313 10:33:18.069379 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:18.069780 master-0 kubenswrapper[3972]: E0313 10:33:18.069597 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:18.069780 master-0 kubenswrapper[3972]: W0313 10:33:18.069379 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:18.069780 master-0 kubenswrapper[3972]: E0313 10:33:18.069696 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:18.072730 master-0 kubenswrapper[3972]: I0313 10:33:18.072683 3972 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 13 10:33:18.074157 master-0 kubenswrapper[3972]: I0313 10:33:18.074126 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074163 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074183 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074200 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074217 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074233 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074251 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 13 10:33:18.074263 master-0 kubenswrapper[3972]: I0313 10:33:18.074267 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 13 10:33:18.074598 master-0 kubenswrapper[3972]: I0313 10:33:18.074288 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 13 10:33:18.074598 master-0 kubenswrapper[3972]: I0313 10:33:18.074309 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 13 10:33:18.074598 master-0 kubenswrapper[3972]: I0313 10:33:18.074354 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 13 10:33:18.074598 master-0 kubenswrapper[3972]: I0313 10:33:18.074376 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 13 10:33:18.076526 master-0 kubenswrapper[3972]: I0313 10:33:18.076481 3972 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 13 10:33:18.077336 master-0 kubenswrapper[3972]: I0313 10:33:18.077283 3972 server.go:1280] "Started kubelet"
Mar 13 10:33:18.077915 master-0 kubenswrapper[3972]: I0313 10:33:18.077774 3972 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 13 10:33:18.078019 master-0 kubenswrapper[3972]: I0313 10:33:18.077766 3972 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 13 10:33:18.078175 master-0 kubenswrapper[3972]: I0313 10:33:18.078115 3972 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 13 10:33:18.078794 master-0 kubenswrapper[3972]: I0313 10:33:18.078747 3972 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 13 10:33:18.079040 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 13 10:33:18.082060 master-0 kubenswrapper[3972]: I0313 10:33:18.081982 3972 server.go:449] "Adding debug handlers to kubelet server"
Mar 13 10:33:18.082431 master-0 kubenswrapper[3972]: I0313 10:33:18.082371 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:18.082965 master-0 kubenswrapper[3972]: I0313 10:33:18.082916 3972 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 13 10:33:18.083155 master-0 kubenswrapper[3972]: I0313 10:33:18.083000 3972 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 13 10:33:18.083796 master-0 kubenswrapper[3972]: I0313 10:33:18.083761 3972 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 13 10:33:18.083954 master-0 kubenswrapper[3972]: E0313 10:33:18.083892 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:18.084076 master-0 kubenswrapper[3972]: I0313 10:33:18.084051 3972 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 13 10:33:18.084301 master-0 kubenswrapper[3972]: I0313 10:33:18.084205 3972 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 13 10:33:18.085876 master-0 kubenswrapper[3972]: I0313 10:33:18.085757 3972 reconstruct.go:97] "Volume reconstruction finished"
Mar 13 10:33:18.085876 master-0 kubenswrapper[3972]: I0313 10:33:18.085797 3972 reconciler.go:26] "Reconciler: start to sync state"
Mar 13 10:33:18.086207 master-0 kubenswrapper[3972]: E0313 10:33:18.086037 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms"
Mar 13 10:33:18.086950 master-0 kubenswrapper[3972]: W0313 10:33:18.086854 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:18.087047 master-0 kubenswrapper[3972]: E0313 10:33:18.086975 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:18.090438 master-0 kubenswrapper[3972]: E0313 10:33:18.088874 3972 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c601558c8181b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.077220891 +0000 UTC m=+0.695337319,LastTimestamp:2026-03-13 10:33:18.077220891 +0000 UTC m=+0.695337319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:33:18.092277 master-0 kubenswrapper[3972]: I0313 10:33:18.092233 3972 factory.go:55] Registering systemd factory
Mar 13 10:33:18.092358 master-0 kubenswrapper[3972]: I0313 10:33:18.092314 3972 factory.go:221] Registration of the systemd container factory successfully
Mar 13 10:33:18.093172 master-0 kubenswrapper[3972]: I0313 10:33:18.093124 3972 factory.go:153] Registering CRI-O factory
Mar 13 10:33:18.093172 master-0 kubenswrapper[3972]: I0313 10:33:18.093169 3972 factory.go:221] Registration of the crio container factory successfully
Mar 13 10:33:18.093336 master-0 kubenswrapper[3972]: I0313 10:33:18.093297 3972 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 13 10:33:18.093411 master-0 kubenswrapper[3972]: I0313 10:33:18.093388 3972 factory.go:103] Registering Raw factory
Mar 13 10:33:18.093497 master-0 kubenswrapper[3972]: I0313 10:33:18.093463 3972 manager.go:1196] Started watching for new ooms in manager
Mar 13 10:33:18.096135 master-0 kubenswrapper[3972]: I0313 10:33:18.095853 3972 manager.go:319] Starting recovery of all containers
Mar 13 10:33:18.103121 master-0 kubenswrapper[3972]: E0313 10:33:18.103057 3972 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Mar 13 10:33:18.115052 master-0 kubenswrapper[3972]: I0313 10:33:18.114702 3972 manager.go:324] Recovery completed
Mar 13 10:33:18.128186 master-0 kubenswrapper[3972]: I0313 10:33:18.128128 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.130774 master-0 kubenswrapper[3972]: I0313 10:33:18.130710 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.131022 master-0 kubenswrapper[3972]: I0313 10:33:18.130986 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.131022 master-0 kubenswrapper[3972]: I0313 10:33:18.131018 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.131974 master-0 kubenswrapper[3972]: I0313 10:33:18.131929 3972 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 13 10:33:18.132014 master-0 kubenswrapper[3972]: I0313 10:33:18.131969 3972 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 13 10:33:18.132050 master-0 kubenswrapper[3972]: I0313 10:33:18.132014 3972 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 10:33:18.136919 master-0 kubenswrapper[3972]: I0313 10:33:18.136867 3972 policy_none.go:49] "None policy: Start"
Mar 13 10:33:18.138282 master-0 kubenswrapper[3972]: I0313 10:33:18.138239 3972 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 13 10:33:18.138385 master-0 kubenswrapper[3972]: I0313 10:33:18.138288 3972 state_mem.go:35] "Initializing new in-memory state store"
Mar 13 10:33:18.184790 master-0 kubenswrapper[3972]: E0313 10:33:18.184717 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:18.222809 master-0 kubenswrapper[3972]: I0313 10:33:18.222720 3972 manager.go:334] "Starting Device Plugin manager"
Mar 13 10:33:18.222951 master-0 kubenswrapper[3972]: I0313 10:33:18.222819 3972 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 13 10:33:18.222951 master-0 kubenswrapper[3972]: I0313 10:33:18.222849 3972 server.go:79] "Starting device plugin registration server"
Mar 13 10:33:18.223484 master-0 kubenswrapper[3972]: I0313 10:33:18.223434 3972 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 13 10:33:18.223565 master-0 kubenswrapper[3972]: I0313 10:33:18.223499 3972 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 13 10:33:18.224153 master-0 kubenswrapper[3972]: I0313 10:33:18.224115 3972 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 13 10:33:18.224398 master-0 kubenswrapper[3972]: I0313 10:33:18.224224 3972 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 13 10:33:18.224398 master-0 kubenswrapper[3972]: I0313 10:33:18.224236 3972 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 13 10:33:18.227571 master-0 kubenswrapper[3972]: E0313 10:33:18.227498 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:33:18.288617 master-0 kubenswrapper[3972]: E0313 10:33:18.288524 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 13 10:33:18.305380 master-0 kubenswrapper[3972]: I0313 10:33:18.305261 3972 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 13 10:33:18.308158 master-0 kubenswrapper[3972]: I0313 10:33:18.308090 3972 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 13 10:33:18.308357 master-0 kubenswrapper[3972]: I0313 10:33:18.308215 3972 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 13 10:33:18.308357 master-0 kubenswrapper[3972]: I0313 10:33:18.308272 3972 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 13 10:33:18.308537 master-0 kubenswrapper[3972]: E0313 10:33:18.308362 3972 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 13 10:33:18.309846 master-0 kubenswrapper[3972]: W0313 10:33:18.309734 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:18.309980 master-0 kubenswrapper[3972]: E0313 10:33:18.309865 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:18.323946 master-0 kubenswrapper[3972]: I0313 10:33:18.323892 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.325115 master-0 kubenswrapper[3972]: I0313 10:33:18.325048 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.325224 master-0 kubenswrapper[3972]: I0313 10:33:18.325126 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.325224 master-0 kubenswrapper[3972]: I0313 10:33:18.325144 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.325224 master-0 kubenswrapper[3972]: I0313 10:33:18.325182 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:18.326467 master-0 kubenswrapper[3972]: E0313 10:33:18.326398 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 10:33:18.409004 master-0 kubenswrapper[3972]: I0313 10:33:18.408621 3972 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 13 10:33:18.409004 master-0 kubenswrapper[3972]: I0313 10:33:18.408916 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.411156 master-0 kubenswrapper[3972]: I0313 10:33:18.411008 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.411156 master-0 kubenswrapper[3972]: I0313 10:33:18.411074 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.411156 master-0 kubenswrapper[3972]: I0313 10:33:18.411131 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.411584 master-0 kubenswrapper[3972]: I0313 10:33:18.411443 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.412049 master-0 kubenswrapper[3972]: I0313 10:33:18.411982 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 10:33:18.412170 master-0 kubenswrapper[3972]: I0313 10:33:18.412087 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.412643 master-0 kubenswrapper[3972]: I0313 10:33:18.412577 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.412739 master-0 kubenswrapper[3972]: I0313 10:33:18.412646 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.412739 master-0 kubenswrapper[3972]: I0313 10:33:18.412670 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.412882 master-0 kubenswrapper[3972]: I0313 10:33:18.412834 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.413252 master-0 kubenswrapper[3972]: I0313 10:33:18.413152 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:33:18.413402 master-0 kubenswrapper[3972]: I0313 10:33:18.413241 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.413533 master-0 kubenswrapper[3972]: I0313 10:33:18.413487 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.413631 master-0 kubenswrapper[3972]: I0313 10:33:18.413546 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.413631 master-0 kubenswrapper[3972]: I0313 10:33:18.413567 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.413963 master-0 kubenswrapper[3972]: I0313 10:33:18.413865 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.413963 master-0 kubenswrapper[3972]: I0313 10:33:18.413964 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.414144 master-0 kubenswrapper[3972]: I0313 10:33:18.413983 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.414223 master-0 kubenswrapper[3972]: I0313 10:33:18.414209 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.414895 master-0 kubenswrapper[3972]: I0313 10:33:18.414374 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.414895 master-0 kubenswrapper[3972]: I0313 10:33:18.414442 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:18.414895 master-0 kubenswrapper[3972]: I0313 10:33:18.414599 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.414895 master-0 kubenswrapper[3972]: I0313 10:33:18.414659 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.414895 master-0 kubenswrapper[3972]: I0313 10:33:18.414678 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.415542 master-0 kubenswrapper[3972]: I0313 10:33:18.415319 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.415542 master-0 kubenswrapper[3972]: I0313 10:33:18.415382 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.415542 master-0 kubenswrapper[3972]: I0313 10:33:18.415404 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.415822 master-0 kubenswrapper[3972]: I0313 10:33:18.415609 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:18.415822 master-0 kubenswrapper[3972]: I0313 10:33:18.415801 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.416025 master-0 kubenswrapper[3972]: I0313 10:33:18.415857 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:18.416803 master-0 kubenswrapper[3972]: I0313 10:33:18.416736 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.416803 master-0 kubenswrapper[3972]: I0313 10:33:18.416778 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.417043 master-0 kubenswrapper[3972]: I0313 10:33:18.416815 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.417043 master-0 kubenswrapper[3972]: I0313 10:33:18.416917 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.417043 master-0 kubenswrapper[3972]: I0313 10:33:18.417037 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.417338 master-0 kubenswrapper[3972]: I0313 10:33:18.417059 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.417338 master-0 kubenswrapper[3972]: I0313 10:33:18.416782 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.417338 master-0 kubenswrapper[3972]: I0313 10:33:18.416997 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.417338 master-0 kubenswrapper[3972]: I0313 10:33:18.417169 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.417796 master-0 
kubenswrapper[3972]: I0313 10:33:18.417638 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.417796 master-0 kubenswrapper[3972]: I0313 10:33:18.417705 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:18.419026 master-0 kubenswrapper[3972]: I0313 10:33:18.418966 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.419026 master-0 kubenswrapper[3972]: I0313 10:33:18.419023 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.419251 master-0 kubenswrapper[3972]: I0313 10:33:18.419053 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.487422 master-0 kubenswrapper[3972]: I0313 10:33:18.487328 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:33:18.487422 master-0 kubenswrapper[3972]: I0313 10:33:18.487417 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.487745 master-0 kubenswrapper[3972]: I0313 10:33:18.487457 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.487745 master-0 kubenswrapper[3972]: I0313 10:33:18.487490 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.487745 master-0 kubenswrapper[3972]: I0313 10:33:18.487524 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.487745 master-0 kubenswrapper[3972]: I0313 10:33:18.487558 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.487742 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.487812 3972 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.487880 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.487924 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.487957 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.487989 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " 
pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.488078 master-0 kubenswrapper[3972]: I0313 10:33:18.488022 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.488657 master-0 kubenswrapper[3972]: I0313 10:33:18.488153 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.488657 master-0 kubenswrapper[3972]: I0313 10:33:18.488244 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.488657 master-0 kubenswrapper[3972]: I0313 10:33:18.488293 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.488657 master-0 kubenswrapper[3972]: I0313 10:33:18.488347 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.526733 master-0 kubenswrapper[3972]: I0313 10:33:18.526675 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:18.528631 master-0 kubenswrapper[3972]: I0313 10:33:18.528548 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:18.528631 master-0 kubenswrapper[3972]: I0313 10:33:18.528634 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:18.528894 master-0 kubenswrapper[3972]: I0313 10:33:18.528848 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:18.529143 master-0 kubenswrapper[3972]: I0313 10:33:18.529061 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 10:33:18.530595 master-0 kubenswrapper[3972]: E0313 10:33:18.530523 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 13 10:33:18.589668 master-0 kubenswrapper[3972]: I0313 10:33:18.589562 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.589668 master-0 kubenswrapper[3972]: I0313 10:33:18.589652 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.589668 master-0 kubenswrapper[3972]: I0313 10:33:18.589688 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.590135 master-0 kubenswrapper[3972]: I0313 10:33:18.589989 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.590216 master-0 kubenswrapper[3972]: I0313 10:33:18.590173 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:33:18.590346 master-0 kubenswrapper[3972]: I0313 10:33:18.590294 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.590469 master-0 kubenswrapper[3972]: I0313 10:33:18.590356 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.590469 master-0 kubenswrapper[3972]: I0313 10:33:18.590365 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.590469 master-0 kubenswrapper[3972]: I0313 10:33:18.590305 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590471 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590479 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590518 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590517 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590538 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590589 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590622 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590633 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.590635 master-0 kubenswrapper[3972]: I0313 10:33:18.590629 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590591 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590666 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590711 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590716 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590731 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590751 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590803 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590813 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590854 3972 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590867 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590905 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590911 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590955 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:18.591557 
master-0 kubenswrapper[3972]: I0313 10:33:18.590972 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.590978 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:33:18.591557 master-0 kubenswrapper[3972]: I0313 10:33:18.591074 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:33:18.690737 master-0 kubenswrapper[3972]: E0313 10:33:18.690490 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 13 10:33:18.748032 master-0 kubenswrapper[3972]: I0313 10:33:18.747910 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 10:33:18.771907 master-0 kubenswrapper[3972]: I0313 10:33:18.771819 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:33:18.791558 master-0 kubenswrapper[3972]: I0313 10:33:18.791444 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 10:33:18.812311 master-0 kubenswrapper[3972]: I0313 10:33:18.812199 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:18.825209 master-0 kubenswrapper[3972]: I0313 10:33:18.825140 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:33:18.931217 master-0 kubenswrapper[3972]: I0313 10:33:18.931126 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:18.932542 master-0 kubenswrapper[3972]: I0313 10:33:18.932492 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:18.932596 master-0 kubenswrapper[3972]: I0313 10:33:18.932555 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:18.932596 master-0 kubenswrapper[3972]: I0313 10:33:18.932574 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:18.932693 master-0 kubenswrapper[3972]: I0313 10:33:18.932656 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:18.933837 master-0 kubenswrapper[3972]: E0313 10:33:18.933774 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 10:33:19.042474 master-0 kubenswrapper[3972]: W0313 10:33:19.042254 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:19.042474 master-0 kubenswrapper[3972]: E0313 10:33:19.042421 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:19.070020 master-0 kubenswrapper[3972]: W0313 10:33:19.069881 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:19.070020 master-0 kubenswrapper[3972]: E0313 10:33:19.069995 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:19.084896 master-0 kubenswrapper[3972]: I0313 10:33:19.084795 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:19.507822 master-0 kubenswrapper[3972]: E0313 10:33:19.507730 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s"
Mar 13 10:33:19.528385 master-0 kubenswrapper[3972]: W0313 10:33:19.528284 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0 WatchSource:0}: Error finding container e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0: Status 404 returned error can't find the container with id e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0
Mar 13 10:33:19.534210 master-0 kubenswrapper[3972]: I0313 10:33:19.534163 3972 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 10:33:19.541074 master-0 kubenswrapper[3972]: W0313 10:33:19.541042 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f WatchSource:0}: Error finding container 06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f: Status 404 returned error can't find the container with id 06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f
Mar 13 10:33:19.560198 master-0 kubenswrapper[3972]: W0313 10:33:19.560166 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84 WatchSource:0}: Error finding container cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84: Status 404 returned error can't find the container with id cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84
Mar 13 10:33:19.617262 master-0 kubenswrapper[3972]: W0313 10:33:19.617214 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c WatchSource:0}: Error finding container e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c: Status 404 returned error can't find the container with id e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c
Mar 13 10:33:19.635642 master-0 kubenswrapper[3972]: W0313 10:33:19.635486 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:19.635870 master-0 kubenswrapper[3972]: E0313 10:33:19.635661 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:19.734999 master-0 kubenswrapper[3972]: I0313 10:33:19.734883 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:19.737442 master-0 kubenswrapper[3972]: I0313 10:33:19.737388 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:19.737517 master-0 kubenswrapper[3972]: I0313 10:33:19.737455 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:19.737517 master-0 kubenswrapper[3972]: I0313 10:33:19.737475 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:19.737689 master-0 kubenswrapper[3972]: I0313 10:33:19.737654 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:19.738938 master-0 kubenswrapper[3972]: E0313 10:33:19.738878 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 10:33:19.780786 master-0 kubenswrapper[3972]: W0313 10:33:19.780601 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:19.780786 master-0 kubenswrapper[3972]: E0313 10:33:19.780709 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:20.044752 master-0 kubenswrapper[3972]: I0313 10:33:20.044467 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 10:33:20.046933 master-0 kubenswrapper[3972]: E0313 10:33:20.046838 3972 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:20.084323 master-0 kubenswrapper[3972]: I0313 10:33:20.084227 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:20.318302 master-0 kubenswrapper[3972]: I0313 10:33:20.317843 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4"}
Mar 13 10:33:20.319789 master-0 kubenswrapper[3972]: I0313 10:33:20.319648 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c"}
Mar 13 10:33:20.321798 master-0 kubenswrapper[3972]: I0313 10:33:20.321764 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84"}
Mar 13 10:33:20.323668 master-0 kubenswrapper[3972]: I0313 10:33:20.323588 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f"}
Mar 13 10:33:20.325024 master-0 kubenswrapper[3972]: I0313 10:33:20.324990 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0"}
Mar 13 10:33:21.085084 master-0 kubenswrapper[3972]: I0313 10:33:21.085028 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:21.109275 master-0 kubenswrapper[3972]: E0313 10:33:21.109219 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 13 10:33:21.223698 master-0 kubenswrapper[3972]: W0313 10:33:21.223658 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:21.223859 master-0 kubenswrapper[3972]: E0313 10:33:21.223715 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:21.340079 master-0 kubenswrapper[3972]: I0313 10:33:21.339947 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:21.342009 master-0 kubenswrapper[3972]: I0313 10:33:21.341326 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:21.342009 master-0 kubenswrapper[3972]: I0313 10:33:21.341366 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:21.342009 master-0 kubenswrapper[3972]: I0313 10:33:21.341382 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:21.342009 master-0 kubenswrapper[3972]: I0313 10:33:21.341491 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:21.342494 master-0 kubenswrapper[3972]: E0313 10:33:21.342451 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 10:33:21.469025 master-0 kubenswrapper[3972]: W0313 10:33:21.468947 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:21.469255 master-0 kubenswrapper[3972]: E0313 10:33:21.469036 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:22.073398 master-0 kubenswrapper[3972]: W0313 10:33:22.073331 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:22.073613 master-0 kubenswrapper[3972]: E0313 10:33:22.073407 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:22.083830 master-0 kubenswrapper[3972]: I0313 10:33:22.083752 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:22.312844 master-0 kubenswrapper[3972]: W0313 10:33:22.312707 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:22.312844 master-0 kubenswrapper[3972]: E0313 10:33:22.312782 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:23.084467 master-0 kubenswrapper[3972]: I0313 10:33:23.084401 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:23.441293 master-0 kubenswrapper[3972]: E0313 10:33:23.441052 3972 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189c601558c8181b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.077220891 +0000 UTC m=+0.695337319,LastTimestamp:2026-03-13 10:33:18.077220891 +0000 UTC m=+0.695337319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:33:24.083832 master-0 kubenswrapper[3972]: I0313 10:33:24.083580 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:24.311921 master-0 kubenswrapper[3972]: E0313 10:33:24.311864 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 13 10:33:24.335834 master-0 kubenswrapper[3972]: I0313 10:33:24.335711 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"fc59a335ab92b5426116aa2f5adb31266760392f014df421d723f95bb6f6ebfb"}
Mar 13 10:33:24.397646 master-0 kubenswrapper[3972]: I0313 10:33:24.397589 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates
Mar 13 10:33:24.401236 master-0 kubenswrapper[3972]: E0313 10:33:24.401196 3972 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:24.543377 master-0 kubenswrapper[3972]: I0313 10:33:24.543300 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:24.544819 master-0 kubenswrapper[3972]: I0313 10:33:24.544771 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:24.544911 master-0 kubenswrapper[3972]: I0313 10:33:24.544863 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:24.544911 master-0 kubenswrapper[3972]: I0313 10:33:24.544902 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:24.545054 master-0 kubenswrapper[3972]: I0313 10:33:24.545031 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:24.545870 master-0 kubenswrapper[3972]: E0313 10:33:24.545824 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 13 10:33:24.658693 master-0 kubenswrapper[3972]: W0313 10:33:24.657708 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:24.658693 master-0 kubenswrapper[3972]: E0313 10:33:24.657839 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:25.084526 master-0 kubenswrapper[3972]: I0313 10:33:25.084391 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:25.340911 master-0 kubenswrapper[3972]: I0313 10:33:25.340759 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"2b53706ef774eb15c126f57be58e4c0c9f005142fd0e9af295b43871ae8de7ef"}
Mar 13 10:33:25.341123 master-0 kubenswrapper[3972]: I0313 10:33:25.340926 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:25.341750 master-0 kubenswrapper[3972]: I0313 10:33:25.341712 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:25.341805 master-0 kubenswrapper[3972]: I0313 10:33:25.341752 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:25.341805 master-0 kubenswrapper[3972]: I0313 10:33:25.341774 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:26.084716 master-0 kubenswrapper[3972]: I0313 10:33:26.084641 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:26.343500 master-0 kubenswrapper[3972]: I0313 10:33:26.343361 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:26.344687 master-0 kubenswrapper[3972]: I0313 10:33:26.344627 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:26.344759 master-0 kubenswrapper[3972]: I0313 10:33:26.344746 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:26.344797 master-0 kubenswrapper[3972]: I0313 10:33:26.344766 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:26.642073 master-0 kubenswrapper[3972]: W0313 10:33:26.642006 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:26.642297 master-0 kubenswrapper[3972]: E0313 10:33:26.642110 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:27.084146 master-0 kubenswrapper[3972]: I0313 10:33:27.084013 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:27.436836 master-0 kubenswrapper[3972]: W0313 10:33:27.436731 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:27.437487 master-0 kubenswrapper[3972]: E0313 10:33:27.436865 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:28.083761 master-0 kubenswrapper[3972]: I0313 10:33:28.083717 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:28.227740 master-0 kubenswrapper[3972]: E0313 10:33:28.227676 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:33:28.282673 master-0 kubenswrapper[3972]: W0313 10:33:28.282611 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:28.282860 master-0 kubenswrapper[3972]: E0313 10:33:28.282683 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 13 10:33:29.084877 master-0 kubenswrapper[3972]: I0313 10:33:29.084772 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 13 10:33:29.353634 master-0 kubenswrapper[3972]: I0313 10:33:29.353503 3972 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e61aa885acf4e08508a6ce338221d6e4395ca3102a9b91ced2db728621c8a1d6" exitCode=0
Mar 13 10:33:29.353910 master-0 kubenswrapper[3972]: I0313 10:33:29.353653 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:29.353910 master-0 kubenswrapper[3972]: I0313 10:33:29.353659 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e61aa885acf4e08508a6ce338221d6e4395ca3102a9b91ced2db728621c8a1d6"}
Mar 13 10:33:29.354891 master-0 kubenswrapper[3972]: I0313 10:33:29.354859 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:29.354971 master-0 kubenswrapper[3972]: I0313 10:33:29.354908 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:29.354971 master-0 kubenswrapper[3972]: I0313 10:33:29.354923 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:29.357675 master-0 kubenswrapper[3972]: I0313 10:33:29.357628 3972 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="6babec6a5c3649a6bea3ec1be171dc4161391ea03cf72605db3e897bf23d8b34" exitCode=1
Mar 13 10:33:29.357804 master-0 kubenswrapper[3972]: I0313 10:33:29.357739 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"6babec6a5c3649a6bea3ec1be171dc4161391ea03cf72605db3e897bf23d8b34"}
Mar 13 10:33:29.359372 master-0 kubenswrapper[3972]: I0313 10:33:29.359339 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:29.359372 master-0 kubenswrapper[3972]: I0313 10:33:29.359343 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"c31330ee13d04180dffe9b5d1e1dc3fa90364bd389b7bdc31c0456dc4709e569"}
Mar 13 10:33:29.360064 master-0 kubenswrapper[3972]: I0313 10:33:29.360022 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:29.360064 master-0 kubenswrapper[3972]: I0313 10:33:29.360061 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:29.360230 master-0 kubenswrapper[3972]: I0313 10:33:29.360080 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:29.360960 master-0 kubenswrapper[3972]: I0313 10:33:29.360917 3972 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="ee4e3ab4663e1587ce994fc6b4abf7c85bf2b949922e7c558f6898fa4c2d1ce1" exitCode=0
Mar 13 10:33:29.360960 master-0 kubenswrapper[3972]: I0313 10:33:29.360950 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"ee4e3ab4663e1587ce994fc6b4abf7c85bf2b949922e7c558f6898fa4c2d1ce1"}
Mar 13 10:33:29.361135 master-0 kubenswrapper[3972]: I0313 10:33:29.360981 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:29.361643 master-0 kubenswrapper[3972]: I0313 10:33:29.361605 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:29.361643 master-0 kubenswrapper[3972]: I0313 10:33:29.361643 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:29.361836 master-0 kubenswrapper[3972]: I0313 10:33:29.361658 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:29.364197 master-0 kubenswrapper[3972]: I0313 10:33:29.364176 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:29.364866 master-0 kubenswrapper[3972]: I0313 10:33:29.364781 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:29.364866 master-0 kubenswrapper[3972]: I0313 10:33:29.364813 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:29.364866 master-0 kubenswrapper[3972]: I0313 10:33:29.364829 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:30.363887 master-0 kubenswrapper[3972]: I0313 10:33:30.363760 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log"
Mar 13 10:33:30.364817 master-0 kubenswrapper[3972]: I0313 10:33:30.364179 3972 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="723879e9622893253609dc64b27d1e09b7d5e4c5398c0053b730570d305069ca" exitCode=1
Mar 13 10:33:30.364817 master-0 kubenswrapper[3972]: I0313 10:33:30.364260 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"723879e9622893253609dc64b27d1e09b7d5e4c5398c0053b730570d305069ca"}
Mar 13 10:33:30.364817 master-0 kubenswrapper[3972]: I0313 10:33:30.364411 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:30.365749 master-0 kubenswrapper[3972]: I0313 10:33:30.365683 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:30.365871 master-0 kubenswrapper[3972]: I0313 10:33:30.365749 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:30.365871 master-0 kubenswrapper[3972]: I0313 10:33:30.365775 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:30.366271 master-0 kubenswrapper[3972]: I0313 10:33:30.366199 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"038536df2c456779ce7e0291a2536f4028dbe7eacec6c366598f83e56cd809ba"}
Mar 13 10:33:30.366344 master-0 kubenswrapper[3972]: I0313 10:33:30.366226 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:30.366417 master-0 kubenswrapper[3972]: I0313 10:33:30.366382 3972 scope.go:117] "RemoveContainer" containerID="723879e9622893253609dc64b27d1e09b7d5e4c5398c0053b730570d305069ca"
Mar 13 10:33:30.368021 master-0 kubenswrapper[3972]: I0313 10:33:30.367537 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:30.368021 master-0 kubenswrapper[3972]: I0313 10:33:30.367569 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:30.368021 master-0 kubenswrapper[3972]: I0313 10:33:30.367579 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:30.946373 master-0 kubenswrapper[3972]: I0313 10:33:30.946309 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:30.948632 master-0 kubenswrapper[3972]: I0313 10:33:30.948596 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:30.948632 master-0 kubenswrapper[3972]: I0313 10:33:30.948628 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:30.948632 master-0 kubenswrapper[3972]: I0313 10:33:30.948637 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:30.948794 master-0 kubenswrapper[3972]: I0313 10:33:30.948684 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:31.191170 master-0 kubenswrapper[3972]: E0313 10:33:31.191107 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 10:33:31.191430 master-0 kubenswrapper[3972]: E0313 10:33:31.191338 3972 kubelet_node_status.go:99]
"Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 13 10:33:31.191430 master-0 kubenswrapper[3972]: I0313 10:33:31.191392 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:31.392133 master-0 kubenswrapper[3972]: I0313 10:33:31.377172 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 13 10:33:31.392133 master-0 kubenswrapper[3972]: I0313 10:33:31.377805 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28"} Mar 13 10:33:31.392133 master-0 kubenswrapper[3972]: I0313 10:33:31.377917 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:31.392133 master-0 kubenswrapper[3972]: I0313 10:33:31.378601 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:31.392133 master-0 kubenswrapper[3972]: I0313 10:33:31.378620 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:31.392133 master-0 kubenswrapper[3972]: I0313 10:33:31.378628 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:32.084757 master-0 kubenswrapper[3972]: I0313 10:33:32.084635 3972 csi_plugin.go:884] Failed to contact API server when waiting 
for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:32.382779 master-0 kubenswrapper[3972]: I0313 10:33:32.382677 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:32.382974 master-0 kubenswrapper[3972]: I0313 10:33:32.382826 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"d4c4f345608352771d181c87ae83f87748ecbf6ccdee52cebdd330e421648437"} Mar 13 10:33:32.389590 master-0 kubenswrapper[3972]: I0313 10:33:32.389536 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:32.389590 master-0 kubenswrapper[3972]: I0313 10:33:32.389583 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:32.389590 master-0 kubenswrapper[3972]: I0313 10:33:32.389595 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:32.389962 master-0 kubenswrapper[3972]: I0313 10:33:32.389935 3972 scope.go:117] "RemoveContainer" containerID="6babec6a5c3649a6bea3ec1be171dc4161391ea03cf72605db3e897bf23d8b34" Mar 13 10:33:32.390989 master-0 kubenswrapper[3972]: I0313 10:33:32.390889 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 13 10:33:32.391967 master-0 kubenswrapper[3972]: I0313 10:33:32.391933 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 13 
10:33:32.392520 master-0 kubenswrapper[3972]: I0313 10:33:32.392483 3972 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28" exitCode=1 Mar 13 10:33:32.392900 master-0 kubenswrapper[3972]: I0313 10:33:32.392528 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28"} Mar 13 10:33:32.392900 master-0 kubenswrapper[3972]: I0313 10:33:32.392603 3972 scope.go:117] "RemoveContainer" containerID="723879e9622893253609dc64b27d1e09b7d5e4c5398c0053b730570d305069ca" Mar 13 10:33:32.392900 master-0 kubenswrapper[3972]: I0313 10:33:32.392639 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:32.393582 master-0 kubenswrapper[3972]: I0313 10:33:32.393550 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:32.393582 master-0 kubenswrapper[3972]: I0313 10:33:32.393578 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:32.393682 master-0 kubenswrapper[3972]: I0313 10:33:32.393589 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:32.394358 master-0 kubenswrapper[3972]: I0313 10:33:32.394338 3972 scope.go:117] "RemoveContainer" containerID="4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28" Mar 13 10:33:32.394572 master-0 kubenswrapper[3972]: E0313 10:33:32.394537 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed 
container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 13 10:33:32.736124 master-0 kubenswrapper[3972]: I0313 10:33:32.736036 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 13 10:33:32.759935 master-0 kubenswrapper[3972]: I0313 10:33:32.759463 3972 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 13 10:33:33.088753 master-0 kubenswrapper[3972]: I0313 10:33:33.088586 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:33.329755 master-0 kubenswrapper[3972]: W0313 10:33:33.329676 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:33.329755 master-0 kubenswrapper[3972]: E0313 10:33:33.329759 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 10:33:33.397532 master-0 kubenswrapper[3972]: I0313 10:33:33.397445 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" 
event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"9e202824b084c4177db3bd9002d881090f9c8da16dc67819aecdad944afe647d"} Mar 13 10:33:33.398176 master-0 kubenswrapper[3972]: I0313 10:33:33.397618 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:33.398543 master-0 kubenswrapper[3972]: I0313 10:33:33.398509 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:33.398695 master-0 kubenswrapper[3972]: I0313 10:33:33.398552 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:33.398695 master-0 kubenswrapper[3972]: I0313 10:33:33.398569 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:33.401505 master-0 kubenswrapper[3972]: I0313 10:33:33.401446 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 13 10:33:33.405548 master-0 kubenswrapper[3972]: I0313 10:33:33.405503 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:33.406501 master-0 kubenswrapper[3972]: I0313 10:33:33.406460 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:33.406501 master-0 kubenswrapper[3972]: I0313 10:33:33.406501 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:33.406608 master-0 kubenswrapper[3972]: I0313 10:33:33.406516 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:33.406943 master-0 kubenswrapper[3972]: I0313 10:33:33.406905 3972 scope.go:117] 
"RemoveContainer" containerID="4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28" Mar 13 10:33:33.407161 master-0 kubenswrapper[3972]: E0313 10:33:33.407117 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 13 10:33:33.423004 master-0 kubenswrapper[3972]: I0313 10:33:33.422953 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"476341e9a176df7914ed42068e9cb3e621e16d05240f26c7f1a1bd7339384984"} Mar 13 10:33:33.423119 master-0 kubenswrapper[3972]: I0313 10:33:33.423051 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:33.423861 master-0 kubenswrapper[3972]: I0313 10:33:33.423799 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:33.423903 master-0 kubenswrapper[3972]: I0313 10:33:33.423870 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:33.423903 master-0 kubenswrapper[3972]: I0313 10:33:33.423886 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:33.447520 master-0 kubenswrapper[3972]: E0313 10:33:33.447359 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c601558c8181b 
default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.077220891 +0000 UTC m=+0.695337319,LastTimestamp:2026-03-13 10:33:18.077220891 +0000 UTC m=+0.695337319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.452033 master-0 kubenswrapper[3972]: E0313 10:33:33.451869 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.458689 master-0 kubenswrapper[3972]: E0313 10:33:33.458558 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.464212 master-0 kubenswrapper[3972]: E0313 10:33:33.464005 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.469011 master-0 kubenswrapper[3972]: E0313 10:33:33.468866 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c601561e5bb7e default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.230158206 +0000 UTC m=+0.848274634,LastTimestamp:2026-03-13 10:33:18.230158206 +0000 UTC m=+0.848274634,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.474941 master-0 kubenswrapper[3972]: E0313 10:33:33.474788 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.325082635 +0000 UTC m=+0.943199063,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.479274 master-0 kubenswrapper[3972]: E0313 10:33:33.479134 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.325138616 +0000 UTC m=+0.943255034,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.484425 master-0 kubenswrapper[3972]: E0313 10:33:33.484298 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfd2257\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.325153216 +0000 UTC m=+0.943269634,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.489897 master-0 kubenswrapper[3972]: E0313 10:33:33.489756 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.411046435 +0000 UTC m=+1.029162863,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.496001 master-0 kubenswrapper[3972]: E0313 10:33:33.495842 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.411088816 +0000 UTC m=+1.029205234,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.501285 master-0 kubenswrapper[3972]: E0313 10:33:33.501042 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfd2257\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.411146667 +0000 UTC m=+1.029263095,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.515275 master-0 kubenswrapper[3972]: E0313 10:33:33.515063 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.412620651 +0000 UTC m=+1.030737079,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.524541 master-0 kubenswrapper[3972]: E0313 10:33:33.524372 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.412662642 +0000 UTC m=+1.030779070,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.531028 master-0 kubenswrapper[3972]: E0313 10:33:33.530881 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfd2257\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.412683722 +0000 UTC m=+1.030800150,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.536590 master-0 kubenswrapper[3972]: E0313 10:33:33.536447 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.413524486 +0000 UTC m=+1.031640924,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.542731 master-0 kubenswrapper[3972]: E0313 10:33:33.542588 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.413559867 +0000 UTC m=+1.031676285,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.549596 master-0 kubenswrapper[3972]: E0313 10:33:33.549422 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfd2257\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.413579127 +0000 UTC m=+1.031695555,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.555333 master-0 kubenswrapper[3972]: E0313 10:33:33.555208 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.413941083 +0000 UTC m=+1.032057511,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.560047 master-0 kubenswrapper[3972]: E0313 10:33:33.559880 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.413976804 +0000 UTC m=+1.032093232,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.565305 master-0 kubenswrapper[3972]: E0313 10:33:33.565130 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfd2257\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.413993744 +0000 UTC m=+1.032110162,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.569742 master-0 kubenswrapper[3972]: E0313 10:33:33.569584 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.414633455 +0000 UTC m=+1.032749883,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.574699 master-0 kubenswrapper[3972]: E0313 10:33:33.574566 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.414670665 +0000 UTC m=+1.032787093,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.578515 master-0 kubenswrapper[3972]: E0313 10:33:33.578439 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfd2257\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfd2257 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131028567 +0000 UTC m=+0.749144985,LastTimestamp:2026-03-13 10:33:18.414687356 +0000 UTC m=+1.032803784,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.582497 master-0 kubenswrapper[3972]: E0313 10:33:33.582361 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfc0db1\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfc0db1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.130957745 +0000 UTC m=+0.749074183,LastTimestamp:2026-03-13 10:33:18.415349057 +0000 UTC m=+1.033465485,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.587371 master-0 kubenswrapper[3972]: E0313 10:33:33.587244 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189c60155bfcde86\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189c60155bfcde86 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:18.131011206 +0000 UTC m=+0.749127634,LastTimestamp:2026-03-13 10:33:18.415396017 +0000 UTC m=+1.033512445,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.592394 master-0 kubenswrapper[3972]: E0313 10:33:33.592303 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6015af9e1d0f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:19.534087439 +0000 UTC m=+2.152203827,LastTimestamp:2026-03-13 10:33:19.534087439 +0000 UTC m=+2.152203827,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.595996 master-0 kubenswrapper[3972]: E0313 10:33:33.595883 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c6015b06b3f92 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:19.547531154 +0000 UTC m=+2.165647542,LastTimestamp:2026-03-13 10:33:19.547531154 +0000 UTC m=+2.165647542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.599784 master-0 kubenswrapper[3972]: E0313 10:33:33.599682 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6015b14e2263 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:19.562400355 +0000 UTC m=+2.180516753,LastTimestamp:2026-03-13 10:33:19.562400355 +0000 UTC m=+2.180516753,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.603384 master-0 kubenswrapper[3972]: E0313 10:33:33.603308 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6015b4c14989 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:19.620278665 +0000 UTC m=+2.238395063,LastTimestamp:2026-03-13 10:33:19.620278665 +0000 UTC m=+2.238395063,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.607253 master-0 kubenswrapper[3972]: E0313 10:33:33.607079 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6015b83af902 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:19.67858509 +0000 UTC m=+2.296701478,LastTimestamp:2026-03-13 10:33:19.67858509 +0000 UTC m=+2.296701478,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.611982 master-0 kubenswrapper[3972]: E0313 10:33:33.611861 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6016b5217179 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 4.301s (4.301s including waiting). 
Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:23.921547641 +0000 UTC m=+6.539664039,LastTimestamp:2026-03-13 10:33:23.921547641 +0000 UTC m=+6.539664039,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.615687 master-0 kubenswrapper[3972]: E0313 10:33:33.615560 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6016c1db32c9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:24.135047881 +0000 UTC m=+6.753164269,LastTimestamp:2026-03-13 10:33:24.135047881 +0000 UTC m=+6.753164269,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.619621 master-0 kubenswrapper[3972]: E0313 10:33:33.619413 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6016c2939db9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:24.147133881 +0000 UTC m=+6.765250259,LastTimestamp:2026-03-13 10:33:24.147133881 +0000 UTC m=+6.765250259,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.625925 master-0 kubenswrapper[3972]: E0313 10:33:33.625807 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6016c2baad81 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:24.149693825 +0000 UTC m=+6.767810213,LastTimestamp:2026-03-13 10:33:24.149693825 +0000 UTC m=+6.767810213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.629711 master-0 kubenswrapper[3972]: E0313 10:33:33.629587 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6016cde31f6f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:24.336893807 +0000 UTC m=+6.955010195,LastTimestamp:2026-03-13 10:33:24.336893807 +0000 UTC m=+6.955010195,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.633501 master-0 kubenswrapper[3972]: E0313 10:33:33.633376 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6016ceb58464 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:24.350682212 +0000 UTC m=+6.968798600,LastTimestamp:2026-03-13 10:33:24.350682212 +0000 UTC m=+6.968798600,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.637594 master-0 kubenswrapper[3972]: E0313 10:33:33.637456 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6017bb8a08c0 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 8.645s (8.645s including waiting). Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.324032704 +0000 UTC m=+10.942149112,LastTimestamp:2026-03-13 10:33:28.324032704 +0000 UTC m=+10.942149112,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.641388 master-0 kubenswrapper[3972]: E0313 10:33:33.641217 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c6017bc33e59f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 8.787s (8.787s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.335164831 +0000 UTC m=+10.953281219,LastTimestamp:2026-03-13 10:33:28.335164831 +0000 UTC m=+10.953281219,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.646189 master-0 kubenswrapper[3972]: E0313 10:33:33.646016 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6017be8b3bce kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 8.811s (8.811s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.374442958 +0000 UTC m=+10.992559356,LastTimestamp:2026-03-13 10:33:28.374442958 +0000 UTC m=+10.992559356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.650218 master-0 kubenswrapper[3972]: E0313 10:33:33.650064 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6017c1307d86 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 8.884s (8.884s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.418827654 +0000 UTC m=+11.036944042,LastTimestamp:2026-03-13 10:33:28.418827654 +0000 UTC m=+11.036944042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.654166 master-0 kubenswrapper[3972]: E0313 10:33:33.654014 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6017c5739678 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.490333816 +0000 UTC m=+11.108450204,LastTimestamp:2026-03-13 10:33:28.490333816 +0000 UTC m=+11.108450204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.657995 master-0 kubenswrapper[3972]: E0313 10:33:33.657873 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c6017c63df56a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.503596394 +0000 UTC m=+11.121712782,LastTimestamp:2026-03-13 10:33:28.503596394 +0000 UTC m=+11.121712782,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.662585 master-0 kubenswrapper[3972]: E0313 10:33:33.662413 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189c6017c6e6589e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.514631838 +0000 UTC m=+11.132748226,LastTimestamp:2026-03-13 10:33:28.514631838 +0000 UTC m=+11.132748226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.666994 master-0 kubenswrapper[3972]: E0313 10:33:33.666886 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" 
event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6017cc0ebbf2 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.601164786 +0000 UTC m=+11.219281194,LastTimestamp:2026-03-13 10:33:28.601164786 +0000 UTC m=+11.219281194,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.671079 master-0 kubenswrapper[3972]: E0313 10:33:33.670962 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6017ccb2bb99 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.611912601 +0000 UTC m=+11.230028999,LastTimestamp:2026-03-13 10:33:28.611912601 +0000 UTC m=+11.230028999,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.675272 master-0 kubenswrapper[3972]: E0313 10:33:33.675118 3972 event.go:359] "Server rejected event (will not retry!)" err="events is 
forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6017ccb348a8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.611948712 +0000 UTC m=+11.230065100,LastTimestamp:2026-03-13 10:33:28.611948712 +0000 UTC m=+11.230065100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.679735 master-0 kubenswrapper[3972]: E0313 10:33:33.679594 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6017ccc78e66 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.613277286 +0000 UTC m=+11.231393674,LastTimestamp:2026-03-13 10:33:28.613277286 +0000 UTC m=+11.231393674,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.684071 master-0 kubenswrapper[3972]: E0313 10:33:33.683940 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6017cd57a0e7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.622719207 +0000 UTC m=+11.240835595,LastTimestamp:2026-03-13 10:33:28.622719207 +0000 UTC m=+11.240835595,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.687906 master-0 kubenswrapper[3972]: E0313 10:33:33.687752 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6017e45894a4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.008657572 +0000 UTC 
m=+11.626773960,LastTimestamp:2026-03-13 10:33:29.008657572 +0000 UTC m=+11.626773960,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.692682 master-0 kubenswrapper[3972]: E0313 10:33:33.692581 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6017f93245ca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.358468554 +0000 UTC m=+11.976584972,LastTimestamp:2026-03-13 10:33:29.358468554 +0000 UTC m=+11.976584972,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.699409 master-0 kubenswrapper[3972]: E0313 10:33:33.699250 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6017f988a679 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.364129401 +0000 UTC m=+11.982245789,LastTimestamp:2026-03-13 10:33:29.364129401 +0000 UTC m=+11.982245789,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.704767 master-0 kubenswrapper[3972]: E0313 10:33:33.704651 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6018041dac22 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.541667874 +0000 UTC m=+12.159784262,LastTimestamp:2026-03-13 10:33:29.541667874 +0000 UTC m=+12.159784262,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.709305 master-0 kubenswrapper[3972]: E0313 10:33:33.709199 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c601804439b50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.544153936 +0000 UTC m=+12.162270324,LastTimestamp:2026-03-13 10:33:29.544153936 +0000 UTC m=+12.162270324,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.713046 master-0 kubenswrapper[3972]: E0313 10:33:33.712980 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c601804a851b0 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.550754224 +0000 UTC m=+12.168870612,LastTimestamp:2026-03-13 10:33:29.550754224 +0000 UTC m=+12.168870612,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.715959 master-0 kubenswrapper[3972]: E0313 10:33:33.715897 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c601804b88459 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.551815769 +0000 UTC m=+12.169932167,LastTimestamp:2026-03-13 10:33:29.551815769 +0000 UTC m=+12.169932167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.720951 master-0 kubenswrapper[3972]: E0313 10:33:33.720764 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c601804d62cab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.553759403 +0000 UTC m=+12.171875801,LastTimestamp:2026-03-13 10:33:29.553759403 +0000 UTC m=+12.171875801,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.725557 master-0 kubenswrapper[3972]: E0313 10:33:33.725468 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c6017f93245ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6017f93245ca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.358468554 +0000 UTC m=+11.976584972,LastTimestamp:2026-03-13 10:33:30.369167268 +0000 UTC m=+12.987283656,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.729016 master-0 kubenswrapper[3972]: I0313 10:33:33.728982 
3972 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:33.730242 master-0 kubenswrapper[3972]: E0313 10:33:33.730033 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c601804439b50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c601804439b50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.544153936 +0000 UTC m=+12.162270324,LastTimestamp:2026-03-13 10:33:30.73324811 +0000 UTC m=+13.351364498,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.733529 master-0 kubenswrapper[3972]: I0313 10:33:33.733480 3972 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:33.734392 master-0 kubenswrapper[3972]: E0313 10:33:33.734326 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c601804d62cab\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c601804d62cab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] 
map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.553759403 +0000 UTC m=+12.171875801,LastTimestamp:2026-03-13 10:33:30.747647928 +0000 UTC m=+13.365764316,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.738759 master-0 kubenswrapper[3972]: E0313 10:33:33.738683 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c601882a6cdfd kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 3.051s (3.051s including waiting). 
Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:31.664584189 +0000 UTC m=+14.282700617,LastTimestamp:2026-03-13 10:33:31.664584189 +0000 UTC m=+14.282700617,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.743416 master-0 kubenswrapper[3972]: E0313 10:33:33.743246 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c60188d77e4da kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:31.846059226 +0000 UTC m=+14.464175614,LastTimestamp:2026-03-13 10:33:31.846059226 +0000 UTC m=+14.464175614,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.748027 master-0 kubenswrapper[3972]: E0313 10:33:33.747810 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c60188e27af7e kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:31.857579902 +0000 UTC m=+14.475696290,LastTimestamp:2026-03-13 10:33:31.857579902 +0000 UTC m=+14.475696290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.753563 master-0 kubenswrapper[3972]: E0313 10:33:33.753414 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6018ae283e8c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.394487436 +0000 UTC m=+15.012603824,LastTimestamp:2026-03-13 10:33:32.394487436 +0000 UTC m=+15.012603824,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.758495 master-0 kubenswrapper[3972]: E0313 10:33:33.758376 3972 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6018bbe89061 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.625195105 +0000 UTC m=+15.243311493,LastTimestamp:2026-03-13 10:33:32.625195105 +0000 UTC m=+15.243311493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.762944 master-0 kubenswrapper[3972]: E0313 10:33:33.762864 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6018bee98295 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 3.123s (3.123s including waiting). 
Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.675588757 +0000 UTC m=+15.293705165,LastTimestamp:2026-03-13 10:33:32.675588757 +0000 UTC m=+15.293705165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.767684 master-0 kubenswrapper[3972]: E0313 10:33:33.767578 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189c6017cc0ebbf2\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6017cc0ebbf2 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.601164786 +0000 UTC m=+11.219281194,LastTimestamp:2026-03-13 10:33:32.83033655 +0000 UTC m=+15.448452938,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.772228 master-0 kubenswrapper[3972]: E0313 10:33:33.772130 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189c6017ccb348a8\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189c6017ccb348a8 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:28.611948712 +0000 UTC m=+11.230065100,LastTimestamp:2026-03-13 10:33:32.837254872 +0000 UTC m=+15.455371260,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.776080 master-0 kubenswrapper[3972]: E0313 10:33:33.775997 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6018c9d2a8a9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.858640553 +0000 UTC m=+15.476756941,LastTimestamp:2026-03-13 10:33:32.858640553 +0000 UTC m=+15.476756941,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.780186 master-0 kubenswrapper[3972]: E0313 10:33:33.780064 3972 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189c6018ca95219c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.8713855 +0000 UTC m=+15.489501888,LastTimestamp:2026-03-13 10:33:32.8713855 +0000 UTC m=+15.489501888,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:33.784736 master-0 kubenswrapper[3972]: E0313 10:33:33.784654 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c6018ae283e8c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6018ae283e8c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.394487436 +0000 UTC m=+15.012603824,LastTimestamp:2026-03-13 10:33:33.407069453 +0000 UTC 
m=+16.025185851,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:33:34.091366 master-0 kubenswrapper[3972]: I0313 10:33:34.091190 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:34.426320 master-0 kubenswrapper[3972]: I0313 10:33:34.426233 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:34.427514 master-0 kubenswrapper[3972]: I0313 10:33:34.426406 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:34.427514 master-0 kubenswrapper[3972]: I0313 10:33:34.426234 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:34.427704 master-0 kubenswrapper[3972]: I0313 10:33:34.427645 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:34.427805 master-0 kubenswrapper[3972]: I0313 10:33:34.427707 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:34.427805 master-0 kubenswrapper[3972]: I0313 10:33:34.427737 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:34.429292 master-0 kubenswrapper[3972]: I0313 10:33:34.429233 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:34.429516 master-0 kubenswrapper[3972]: I0313 10:33:34.429305 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 
13 10:33:34.429516 master-0 kubenswrapper[3972]: I0313 10:33:34.429332 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:34.435828 master-0 kubenswrapper[3972]: I0313 10:33:34.435772 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:33:35.091271 master-0 kubenswrapper[3972]: I0313 10:33:35.091168 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:35.336609 master-0 kubenswrapper[3972]: W0313 10:33:35.336524 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 13 10:33:35.336609 master-0 kubenswrapper[3972]: E0313 10:33:35.336614 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 13 10:33:35.428968 master-0 kubenswrapper[3972]: I0313 10:33:35.428863 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:35.430433 master-0 kubenswrapper[3972]: I0313 10:33:35.430376 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:35.430508 master-0 kubenswrapper[3972]: I0313 10:33:35.430436 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasNoDiskPressure" Mar 13 10:33:35.430508 master-0 kubenswrapper[3972]: I0313 10:33:35.430457 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:36.088061 master-0 kubenswrapper[3972]: I0313 10:33:36.088011 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:36.430738 master-0 kubenswrapper[3972]: I0313 10:33:36.430648 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:33:36.431820 master-0 kubenswrapper[3972]: I0313 10:33:36.431749 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:33:36.431920 master-0 kubenswrapper[3972]: I0313 10:33:36.431872 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:33:36.431920 master-0 kubenswrapper[3972]: I0313 10:33:36.431902 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:33:37.091076 master-0 kubenswrapper[3972]: I0313 10:33:37.090957 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 13 10:33:37.636748 master-0 kubenswrapper[3972]: W0313 10:33:37.636625 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 13 10:33:37.636748 master-0 kubenswrapper[3972]: E0313 10:33:37.636722 3972 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 10:33:38.089393 master-0 kubenswrapper[3972]: I0313 10:33:38.089157 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:38.191899 master-0 kubenswrapper[3972]: I0313 10:33:38.191763 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:38.193647 master-0 kubenswrapper[3972]: I0313 10:33:38.193591 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:38.193753 master-0 kubenswrapper[3972]: I0313 10:33:38.193654 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:38.193753 master-0 kubenswrapper[3972]: I0313 10:33:38.193673 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:38.193887 master-0 kubenswrapper[3972]: I0313 10:33:38.193752 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:38.198714 master-0 kubenswrapper[3972]: E0313 10:33:38.198647 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 13 10:33:38.199214 master-0 kubenswrapper[3972]: E0313 10:33:38.199154 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 10:33:38.228284 master-0 kubenswrapper[3972]: E0313 10:33:38.228183 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:33:38.652705 master-0 kubenswrapper[3972]: I0313 10:33:38.652575 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:38.653547 master-0 kubenswrapper[3972]: I0313 10:33:38.652778 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:38.654132 master-0 kubenswrapper[3972]: I0313 10:33:38.654058 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:38.654132 master-0 kubenswrapper[3972]: I0313 10:33:38.654127 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:38.654266 master-0 kubenswrapper[3972]: I0313 10:33:38.654144 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:38.966203 master-0 kubenswrapper[3972]: I0313 10:33:38.965986 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:39.088965 master-0 kubenswrapper[3972]: I0313 10:33:39.088901 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:39.438595 master-0 kubenswrapper[3972]: I0313 10:33:39.438478 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:39.439417 master-0 kubenswrapper[3972]: I0313 10:33:39.439350 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:39.439417 master-0 kubenswrapper[3972]: I0313 10:33:39.439413 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:39.439648 master-0 kubenswrapper[3972]: I0313 10:33:39.439433 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:40.091020 master-0 kubenswrapper[3972]: I0313 10:33:40.090920 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:40.776029 master-0 kubenswrapper[3972]: I0313 10:33:40.775906 3972 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:40.776323 master-0 kubenswrapper[3972]: I0313 10:33:40.776135 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:40.777228 master-0 kubenswrapper[3972]: I0313 10:33:40.777189 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:40.777228 master-0 kubenswrapper[3972]: I0313 10:33:40.777227 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:40.777320 master-0 kubenswrapper[3972]: I0313 10:33:40.777237 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:40.780142 master-0 kubenswrapper[3972]: I0313 10:33:40.780092 3972 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:41.083618 master-0 kubenswrapper[3972]: W0313 10:33:41.083302 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope
Mar 13 10:33:41.083618 master-0 kubenswrapper[3972]: E0313 10:33:41.083395 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 10:33:41.088794 master-0 kubenswrapper[3972]: I0313 10:33:41.088737 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:41.443150 master-0 kubenswrapper[3972]: I0313 10:33:41.443082 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:41.444014 master-0 kubenswrapper[3972]: I0313 10:33:41.443962 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:41.444014 master-0 kubenswrapper[3972]: I0313 10:33:41.444009 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:41.444014 master-0 kubenswrapper[3972]: I0313 10:33:41.444018 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:41.448431 master-0 kubenswrapper[3972]: I0313 10:33:41.448361 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:41.853556 master-0 kubenswrapper[3972]: I0313 10:33:41.853334 3972 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:41.858247 master-0 kubenswrapper[3972]: I0313 10:33:41.858205 3972 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:42.091304 master-0 kubenswrapper[3972]: I0313 10:33:42.091244 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:42.445961 master-0 kubenswrapper[3972]: I0313 10:33:42.445771 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:42.447206 master-0 kubenswrapper[3972]: I0313 10:33:42.447139 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:42.447285 master-0 kubenswrapper[3972]: I0313 10:33:42.447224 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:42.447285 master-0 kubenswrapper[3972]: I0313 10:33:42.447257 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:42.453868 master-0 kubenswrapper[3972]: I0313 10:33:42.453808 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:33:42.845547 master-0 kubenswrapper[3972]: I0313 10:33:42.845327 3972 csr.go:261] certificate signing request csr-4w22w is approved, waiting to be issued
Mar 13 10:33:43.088199 master-0 kubenswrapper[3972]: I0313 10:33:43.088115 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:43.447865 master-0 kubenswrapper[3972]: I0313 10:33:43.447767 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:43.448791 master-0 kubenswrapper[3972]: I0313 10:33:43.448541 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:43.448791 master-0 kubenswrapper[3972]: I0313 10:33:43.448571 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:43.448791 master-0 kubenswrapper[3972]: I0313 10:33:43.448582 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:44.093599 master-0 kubenswrapper[3972]: I0313 10:33:44.093537 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:44.450461 master-0 kubenswrapper[3972]: I0313 10:33:44.450373 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:44.451284 master-0 kubenswrapper[3972]: I0313 10:33:44.451204 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:44.451284 master-0 kubenswrapper[3972]: I0313 10:33:44.451245 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:44.451284 master-0 kubenswrapper[3972]: I0313 10:33:44.451260 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:45.090773 master-0 kubenswrapper[3972]: I0313 10:33:45.090719 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:45.199229 master-0 kubenswrapper[3972]: I0313 10:33:45.199149 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:45.200519 master-0 kubenswrapper[3972]: I0313 10:33:45.200470 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:45.200519 master-0 kubenswrapper[3972]: I0313 10:33:45.200518 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:45.200620 master-0 kubenswrapper[3972]: I0313 10:33:45.200530 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:45.200955 master-0 kubenswrapper[3972]: I0313 10:33:45.200612 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:45.207752 master-0 kubenswrapper[3972]: E0313 10:33:45.207694 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 10:33:45.208403 master-0 kubenswrapper[3972]: E0313 10:33:45.208358 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 13 10:33:46.089700 master-0 kubenswrapper[3972]: I0313 10:33:46.089543 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:46.309764 master-0 kubenswrapper[3972]: I0313 10:33:46.309663 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:46.310865 master-0 kubenswrapper[3972]: I0313 10:33:46.310818 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:46.310986 master-0 kubenswrapper[3972]: I0313 10:33:46.310871 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:46.310986 master-0 kubenswrapper[3972]: I0313 10:33:46.310910 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:46.311506 master-0 kubenswrapper[3972]: I0313 10:33:46.311468 3972 scope.go:117] "RemoveContainer" containerID="4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28"
Mar 13 10:33:46.323831 master-0 kubenswrapper[3972]: E0313 10:33:46.323685 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c6017f93245ca\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6017f93245ca openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.358468554 +0000 UTC m=+11.976584972,LastTimestamp:2026-03-13 10:33:46.316015235 +0000 UTC m=+28.934131613,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:33:46.512423 master-0 kubenswrapper[3972]: E0313 10:33:46.512271 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c601804439b50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c601804439b50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.544153936 +0000 UTC m=+12.162270324,LastTimestamp:2026-03-13 10:33:46.505443189 +0000 UTC m=+29.123559617,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:33:46.531021 master-0 kubenswrapper[3972]: E0313 10:33:46.530851 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c601804d62cab\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c601804d62cab openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:29.553759403 +0000 UTC m=+12.171875801,LastTimestamp:2026-03-13 10:33:46.524836831 +0000 UTC m=+29.142953219,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:33:47.089450 master-0 kubenswrapper[3972]: I0313 10:33:47.089397 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:47.458845 master-0 kubenswrapper[3972]: I0313 10:33:47.458803 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 10:33:47.459662 master-0 kubenswrapper[3972]: I0313 10:33:47.459613 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 13 10:33:47.460124 master-0 kubenswrapper[3972]: I0313 10:33:47.460041 3972 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3" exitCode=1
Mar 13 10:33:47.460188 master-0 kubenswrapper[3972]: I0313 10:33:47.460120 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3"}
Mar 13 10:33:47.460188 master-0 kubenswrapper[3972]: I0313 10:33:47.460175 3972 scope.go:117] "RemoveContainer" containerID="4d6c08d9757fbf834db9956d15b1fdd3599fd24c858a3ddd4ff4c6d980bb6d28"
Mar 13 10:33:47.460350 master-0 kubenswrapper[3972]: I0313 10:33:47.460316 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:47.461548 master-0 kubenswrapper[3972]: I0313 10:33:47.461468 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:47.461548 master-0 kubenswrapper[3972]: I0313 10:33:47.461503 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:47.461548 master-0 kubenswrapper[3972]: I0313 10:33:47.461514 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:47.461911 master-0 kubenswrapper[3972]: I0313 10:33:47.461876 3972 scope.go:117] "RemoveContainer" containerID="2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3"
Mar 13 10:33:47.462082 master-0 kubenswrapper[3972]: E0313 10:33:47.462040 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 13 10:33:47.467525 master-0 kubenswrapper[3972]: E0313 10:33:47.467286 3972 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189c6018ae283e8c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189c6018ae283e8c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:33:32.394487436 +0000 UTC m=+15.012603824,LastTimestamp:2026-03-13 10:33:47.462009835 +0000 UTC m=+30.080126223,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:33:48.088058 master-0 kubenswrapper[3972]: I0313 10:33:48.087937 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:48.228777 master-0 kubenswrapper[3972]: E0313 10:33:48.228673 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:33:48.465324 master-0 kubenswrapper[3972]: I0313 10:33:48.465199 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 10:33:49.089001 master-0 kubenswrapper[3972]: I0313 10:33:49.088908 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:50.089460 master-0 kubenswrapper[3972]: I0313 10:33:50.089360 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:50.894199 master-0 kubenswrapper[3972]: W0313 10:33:50.894121 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:50.894687 master-0 kubenswrapper[3972]: E0313 10:33:50.894299 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 13 10:33:50.899215 master-0 kubenswrapper[3972]: W0313 10:33:50.899159 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope
Mar 13 10:33:50.899299 master-0 kubenswrapper[3972]: E0313 10:33:50.899219 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError"
Mar 13 10:33:51.090112 master-0 kubenswrapper[3972]: I0313 10:33:51.089969 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:52.194427 master-0 kubenswrapper[3972]: I0313 10:33:52.194253 3972 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 13 10:33:52.208708 master-0 kubenswrapper[3972]: I0313 10:33:52.208623 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:52.210836 master-0 kubenswrapper[3972]: I0313 10:33:52.210801 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:52.210933 master-0 kubenswrapper[3972]: I0313 10:33:52.210870 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:52.210933 master-0 kubenswrapper[3972]: I0313 10:33:52.210879 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:52.211047 master-0 kubenswrapper[3972]: I0313 10:33:52.210957 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:52.212986 master-0 kubenswrapper[3972]: E0313 10:33:52.212957 3972 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 13 10:33:52.213062 master-0 kubenswrapper[3972]: E0313 10:33:52.212958 3972 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 13 10:33:52.241261 master-0 kubenswrapper[3972]: W0313 10:33:52.241171 3972 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope
Mar 13 10:33:52.241480 master-0 kubenswrapper[3972]: E0313 10:33:52.241260 3972 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 13 10:33:52.426134 master-0 kubenswrapper[3972]: I0313 10:33:52.426015 3972 csr.go:257] certificate signing request csr-4w22w is issued
Mar 13 10:33:53.066842 master-0 kubenswrapper[3972]: I0313 10:33:53.066690 3972 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Mar 13 10:33:53.103193 master-0 kubenswrapper[3972]: I0313 10:33:53.102387 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.130124 master-0 kubenswrapper[3972]: I0313 10:33:53.129495 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.188265 master-0 kubenswrapper[3972]: I0313 10:33:53.188208 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.429732 master-0 kubenswrapper[3972]: I0313 10:33:53.429589 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 06:44:37.945969605 +0000 UTC
Mar 13 10:33:53.429732 master-0 kubenswrapper[3972]: I0313 10:33:53.429703 3972 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h10m44.516270798s for next certificate rotation
Mar 13 10:33:53.451232 master-0 kubenswrapper[3972]: I0313 10:33:53.451131 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.451232 master-0 kubenswrapper[3972]: E0313 10:33:53.451199 3972 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 10:33:53.474434 master-0 kubenswrapper[3972]: I0313 10:33:53.474379 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.490599 master-0 kubenswrapper[3972]: I0313 10:33:53.490544 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.554190 master-0 kubenswrapper[3972]: I0313 10:33:53.552059 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.808788 master-0 kubenswrapper[3972]: I0313 10:33:53.808644 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.808788 master-0 kubenswrapper[3972]: E0313 10:33:53.808690 3972 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 10:33:53.908929 master-0 kubenswrapper[3972]: I0313 10:33:53.908840 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.924498 master-0 kubenswrapper[3972]: I0313 10:33:53.924419 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:53.981807 master-0 kubenswrapper[3972]: I0313 10:33:53.981686 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:54.236814 master-0 kubenswrapper[3972]: I0313 10:33:54.236723 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:54.236814 master-0 kubenswrapper[3972]: E0313 10:33:54.236798 3972 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 10:33:54.826470 master-0 kubenswrapper[3972]: I0313 10:33:54.826378 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:54.841989 master-0 kubenswrapper[3972]: I0313 10:33:54.841906 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:54.902087 master-0 kubenswrapper[3972]: I0313 10:33:54.901996 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:55.168002 master-0 kubenswrapper[3972]: I0313 10:33:55.167921 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:55.168002 master-0 kubenswrapper[3972]: E0313 10:33:55.167981 3972 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 10:33:58.229272 master-0 kubenswrapper[3972]: E0313 10:33:58.229153 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:33:58.664271 master-0 kubenswrapper[3972]: I0313 10:33:58.664066 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:58.679559 master-0 kubenswrapper[3972]: I0313 10:33:58.679510 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:58.737005 master-0 kubenswrapper[3972]: I0313 10:33:58.736932 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:59.008424 master-0 kubenswrapper[3972]: I0313 10:33:59.008260 3972 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found
Mar 13 10:33:59.008424 master-0 kubenswrapper[3972]: E0313 10:33:59.008306 3972 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found
Mar 13 10:33:59.213730 master-0 kubenswrapper[3972]: I0313 10:33:59.213634 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:59.215451 master-0 kubenswrapper[3972]: I0313 10:33:59.215334 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:59.215451 master-0 kubenswrapper[3972]: I0313 10:33:59.215402 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:59.215451 master-0 kubenswrapper[3972]: I0313 10:33:59.215421 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:59.215451 master-0 kubenswrapper[3972]: I0313 10:33:59.215490 3972 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:33:59.218728 master-0 kubenswrapper[3972]: E0313 10:33:59.218689 3972 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0"
Mar 13 10:33:59.224944 master-0 kubenswrapper[3972]: I0313 10:33:59.224903 3972 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 13 10:33:59.225049 master-0 kubenswrapper[3972]: E0313 10:33:59.224947 3972 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 13 10:33:59.234013 master-0 kubenswrapper[3972]: E0313 10:33:59.233958 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.309554 master-0 kubenswrapper[3972]: I0313 10:33:59.309355 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:33:59.310744 master-0 kubenswrapper[3972]: I0313 10:33:59.310692 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:33:59.310744 master-0 kubenswrapper[3972]: I0313 10:33:59.310731 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:33:59.310744 master-0 kubenswrapper[3972]: I0313 10:33:59.310746 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:33:59.311369 master-0 kubenswrapper[3972]: I0313 10:33:59.311323 3972 scope.go:117] "RemoveContainer" containerID="2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3"
Mar 13 10:33:59.311669 master-0 kubenswrapper[3972]: E0313 10:33:59.311602 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72"
Mar 13 10:33:59.334450 master-0 kubenswrapper[3972]: E0313 10:33:59.334321 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.435125 master-0 kubenswrapper[3972]: E0313 10:33:59.435041 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.535722 master-0 kubenswrapper[3972]: E0313 10:33:59.535615 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.636539 master-0 kubenswrapper[3972]: E0313 10:33:59.636505 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.737616 master-0 kubenswrapper[3972]: E0313 10:33:59.737500 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.838436 master-0 kubenswrapper[3972]: E0313 10:33:59.837924 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:33:59.939075 master-0 kubenswrapper[3972]: E0313 10:33:59.938861 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.040039 master-0 kubenswrapper[3972]: E0313 10:34:00.039973 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.140438 master-0 kubenswrapper[3972]: E0313 10:34:00.140323 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.194421 master-0 kubenswrapper[3972]: I0313 10:34:00.194146 3972 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates
Mar 13 10:34:00.206266 master-0 kubenswrapper[3972]: I0313 10:34:00.206148 3972 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146
Mar 13 10:34:00.241611 master-0 kubenswrapper[3972]: E0313 10:34:00.241535 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.342898 master-0 kubenswrapper[3972]: E0313 10:34:00.342767 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.443618 master-0 kubenswrapper[3972]: E0313 10:34:00.443545 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.544469 master-0 kubenswrapper[3972]: E0313 10:34:00.544254 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.644726 master-0 kubenswrapper[3972]: E0313 10:34:00.644650 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.745569 master-0 kubenswrapper[3972]: E0313 10:34:00.745492 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.846530 master-0 kubenswrapper[3972]: E0313 10:34:00.846372 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:00.947028 master-0 kubenswrapper[3972]: E0313 10:34:00.946963 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:01.047919 master-0 kubenswrapper[3972]: E0313 10:34:01.047841 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:01.158507 master-0 kubenswrapper[3972]: E0313 10:34:01.148458 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:01.249520 master-0 kubenswrapper[3972]: E0313 10:34:01.249384 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:01.350125 master-0 kubenswrapper[3972]: E0313 10:34:01.350039 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:01.451321 master-0 kubenswrapper[3972]: E0313 10:34:01.451122 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:01.552432 master-0 kubenswrapper[3972]: E0313 10:34:01.552323 3972
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:01.653502 master-0 kubenswrapper[3972]: E0313 10:34:01.653368 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:01.754302 master-0 kubenswrapper[3972]: E0313 10:34:01.754177 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:01.855368 master-0 kubenswrapper[3972]: E0313 10:34:01.855278 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:01.956135 master-0 kubenswrapper[3972]: E0313 10:34:01.956012 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.056713 master-0 kubenswrapper[3972]: E0313 10:34:02.056467 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.157465 master-0 kubenswrapper[3972]: E0313 10:34:02.157365 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.257932 master-0 kubenswrapper[3972]: E0313 10:34:02.257861 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.358741 master-0 kubenswrapper[3972]: E0313 10:34:02.358573 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.459496 master-0 kubenswrapper[3972]: E0313 10:34:02.459414 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.560172 master-0 kubenswrapper[3972]: E0313 10:34:02.560047 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.661131 
master-0 kubenswrapper[3972]: E0313 10:34:02.660999 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.761862 master-0 kubenswrapper[3972]: E0313 10:34:02.761749 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.863023 master-0 kubenswrapper[3972]: E0313 10:34:02.862934 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:02.963814 master-0 kubenswrapper[3972]: E0313 10:34:02.963597 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.063893 master-0 kubenswrapper[3972]: E0313 10:34:03.063777 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.164133 master-0 kubenswrapper[3972]: E0313 10:34:03.164001 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.264692 master-0 kubenswrapper[3972]: E0313 10:34:03.264472 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.364952 master-0 kubenswrapper[3972]: E0313 10:34:03.364857 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.465572 master-0 kubenswrapper[3972]: E0313 10:34:03.465484 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.566566 master-0 kubenswrapper[3972]: E0313 10:34:03.566491 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.667404 master-0 kubenswrapper[3972]: E0313 10:34:03.667334 3972 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Mar 13 10:34:03.783885 master-0 kubenswrapper[3972]: E0313 10:34:03.783726 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.884634 master-0 kubenswrapper[3972]: E0313 10:34:03.884541 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:03.889619 master-0 kubenswrapper[3972]: I0313 10:34:03.889561 3972 csr.go:261] certificate signing request csr-9hj8x is approved, waiting to be issued Mar 13 10:34:03.900025 master-0 kubenswrapper[3972]: I0313 10:34:03.899964 3972 csr.go:257] certificate signing request csr-9hj8x is issued Mar 13 10:34:03.985760 master-0 kubenswrapper[3972]: E0313 10:34:03.985676 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.086037 master-0 kubenswrapper[3972]: E0313 10:34:04.085824 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.186583 master-0 kubenswrapper[3972]: E0313 10:34:04.186501 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.287068 master-0 kubenswrapper[3972]: E0313 10:34:04.286974 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.388051 master-0 kubenswrapper[3972]: E0313 10:34:04.387964 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.488491 master-0 kubenswrapper[3972]: E0313 10:34:04.488388 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.589429 master-0 kubenswrapper[3972]: E0313 10:34:04.589345 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Mar 13 10:34:04.690173 master-0 kubenswrapper[3972]: E0313 10:34:04.690030 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.790991 master-0 kubenswrapper[3972]: E0313 10:34:04.790905 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.891223 master-0 kubenswrapper[3972]: E0313 10:34:04.891072 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:04.901470 master-0 kubenswrapper[3972]: I0313 10:34:04.901406 3972 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 06:19:47.122526026 +0000 UTC Mar 13 10:34:04.901470 master-0 kubenswrapper[3972]: I0313 10:34:04.901463 3972 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h45m42.221069501s for next certificate rotation Mar 13 10:34:04.991783 master-0 kubenswrapper[3972]: E0313 10:34:04.991562 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.091893 master-0 kubenswrapper[3972]: E0313 10:34:05.091765 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.193132 master-0 kubenswrapper[3972]: E0313 10:34:05.192940 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.293551 master-0 kubenswrapper[3972]: E0313 10:34:05.293382 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.393899 master-0 kubenswrapper[3972]: E0313 10:34:05.393778 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.495192 
master-0 kubenswrapper[3972]: E0313 10:34:05.495050 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.536358 master-0 kubenswrapper[3972]: I0313 10:34:05.536294 3972 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 10:34:05.596328 master-0 kubenswrapper[3972]: E0313 10:34:05.596170 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.696949 master-0 kubenswrapper[3972]: E0313 10:34:05.696875 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.797488 master-0 kubenswrapper[3972]: E0313 10:34:05.797429 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.898400 master-0 kubenswrapper[3972]: E0313 10:34:05.898313 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:05.902643 master-0 kubenswrapper[3972]: I0313 10:34:05.902580 3972 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 04:56:00.102838164 +0000 UTC Mar 13 10:34:05.902643 master-0 kubenswrapper[3972]: I0313 10:34:05.902620 3972 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h21m54.200220686s for next certificate rotation Mar 13 10:34:05.999556 master-0 kubenswrapper[3972]: E0313 10:34:05.999491 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.100128 master-0 kubenswrapper[3972]: E0313 10:34:06.100055 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.200816 master-0 kubenswrapper[3972]: E0313 10:34:06.200691 3972 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.301949 master-0 kubenswrapper[3972]: E0313 10:34:06.301864 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.402785 master-0 kubenswrapper[3972]: E0313 10:34:06.402719 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.503901 master-0 kubenswrapper[3972]: E0313 10:34:06.503769 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.605268 master-0 kubenswrapper[3972]: E0313 10:34:06.605181 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.705930 master-0 kubenswrapper[3972]: E0313 10:34:06.705842 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.807156 master-0 kubenswrapper[3972]: E0313 10:34:06.806888 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:06.907268 master-0 kubenswrapper[3972]: E0313 10:34:06.907066 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.008474 master-0 kubenswrapper[3972]: E0313 10:34:07.008374 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.109228 master-0 kubenswrapper[3972]: E0313 10:34:07.108973 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.210028 master-0 kubenswrapper[3972]: E0313 10:34:07.209927 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.592452 
master-0 kubenswrapper[3972]: E0313 10:34:07.592275 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.693510 master-0 kubenswrapper[3972]: E0313 10:34:07.693316 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.794283 master-0 kubenswrapper[3972]: E0313 10:34:07.794062 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.894710 master-0 kubenswrapper[3972]: E0313 10:34:07.894646 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:07.995757 master-0 kubenswrapper[3972]: E0313 10:34:07.995652 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.095871 master-0 kubenswrapper[3972]: E0313 10:34:08.095784 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.196855 master-0 kubenswrapper[3972]: E0313 10:34:08.196627 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.230548 master-0 kubenswrapper[3972]: E0313 10:34:08.230373 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 13 10:34:08.297047 master-0 kubenswrapper[3972]: E0313 10:34:08.296945 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.398011 master-0 kubenswrapper[3972]: E0313 10:34:08.397896 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.498980 master-0 kubenswrapper[3972]: E0313 10:34:08.498749 3972 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.599652 master-0 kubenswrapper[3972]: E0313 10:34:08.599533 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.700672 master-0 kubenswrapper[3972]: E0313 10:34:08.700568 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.801797 master-0 kubenswrapper[3972]: E0313 10:34:08.801512 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:08.902646 master-0 kubenswrapper[3972]: E0313 10:34:08.902494 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.003376 master-0 kubenswrapper[3972]: E0313 10:34:09.003219 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.104235 master-0 kubenswrapper[3972]: E0313 10:34:09.104005 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.204360 master-0 kubenswrapper[3972]: E0313 10:34:09.204223 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.304501 master-0 kubenswrapper[3972]: E0313 10:34:09.304415 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.347041 master-0 kubenswrapper[3972]: E0313 10:34:09.346888 3972 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 13 10:34:09.404618 master-0 kubenswrapper[3972]: E0313 10:34:09.404530 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.505520 master-0 
kubenswrapper[3972]: E0313 10:34:09.505421 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.606283 master-0 kubenswrapper[3972]: E0313 10:34:09.606189 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.707252 master-0 kubenswrapper[3972]: E0313 10:34:09.706966 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.807984 master-0 kubenswrapper[3972]: E0313 10:34:09.807733 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:09.909038 master-0 kubenswrapper[3972]: E0313 10:34:09.908959 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.009536 master-0 kubenswrapper[3972]: E0313 10:34:10.009297 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.109746 master-0 kubenswrapper[3972]: E0313 10:34:10.109641 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.209914 master-0 kubenswrapper[3972]: E0313 10:34:10.209790 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.310431 master-0 kubenswrapper[3972]: E0313 10:34:10.310207 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.411036 master-0 kubenswrapper[3972]: E0313 10:34:10.410883 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.512029 master-0 kubenswrapper[3972]: E0313 10:34:10.511915 3972 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"master-0\" not found" Mar 13 10:34:10.613148 master-0 kubenswrapper[3972]: E0313 10:34:10.612634 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.713491 master-0 kubenswrapper[3972]: E0313 10:34:10.713384 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.814222 master-0 kubenswrapper[3972]: E0313 10:34:10.814065 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:10.915157 master-0 kubenswrapper[3972]: E0313 10:34:10.915026 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.016140 master-0 kubenswrapper[3972]: E0313 10:34:11.016025 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.116621 master-0 kubenswrapper[3972]: E0313 10:34:11.116497 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.217739 master-0 kubenswrapper[3972]: E0313 10:34:11.217583 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.309190 master-0 kubenswrapper[3972]: I0313 10:34:11.309071 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:34:11.311047 master-0 kubenswrapper[3972]: I0313 10:34:11.310981 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:34:11.311278 master-0 kubenswrapper[3972]: I0313 10:34:11.311066 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:34:11.311278 master-0 kubenswrapper[3972]: I0313 10:34:11.311175 3972 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:34:11.311948 master-0 kubenswrapper[3972]: I0313 10:34:11.311891 3972 scope.go:117] "RemoveContainer" containerID="2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3" Mar 13 10:34:11.317711 master-0 kubenswrapper[3972]: E0313 10:34:11.317654 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.418609 master-0 kubenswrapper[3972]: E0313 10:34:11.418410 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.519744 master-0 kubenswrapper[3972]: E0313 10:34:11.519585 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.620270 master-0 kubenswrapper[3972]: E0313 10:34:11.620212 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.721446 master-0 kubenswrapper[3972]: E0313 10:34:11.721369 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.822706 master-0 kubenswrapper[3972]: E0313 10:34:11.822379 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:11.923137 master-0 kubenswrapper[3972]: E0313 10:34:11.923012 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.024132 master-0 kubenswrapper[3972]: E0313 10:34:12.024022 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.125239 master-0 kubenswrapper[3972]: E0313 10:34:12.125155 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 
10:34:12.225625 master-0 kubenswrapper[3972]: E0313 10:34:12.225526 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.325924 master-0 kubenswrapper[3972]: E0313 10:34:12.325826 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.426808 master-0 kubenswrapper[3972]: E0313 10:34:12.426651 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.527149 master-0 kubenswrapper[3972]: E0313 10:34:12.527021 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.611896 master-0 kubenswrapper[3972]: I0313 10:34:12.611819 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 13 10:34:12.612930 master-0 kubenswrapper[3972]: I0313 10:34:12.612871 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"8d5872d3df5ae3d0356feb1227762765a592eb87fd4344b9e636b3a3e963fad0"} Mar 13 10:34:12.613026 master-0 kubenswrapper[3972]: I0313 10:34:12.613007 3972 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:34:12.614335 master-0 kubenswrapper[3972]: I0313 10:34:12.614131 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:34:12.614335 master-0 kubenswrapper[3972]: I0313 10:34:12.614157 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:34:12.614335 master-0 kubenswrapper[3972]: I0313 10:34:12.614169 3972 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:34:12.627838 master-0 kubenswrapper[3972]: E0313 10:34:12.627776 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.729008 master-0 kubenswrapper[3972]: E0313 10:34:12.728829 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.829373 master-0 kubenswrapper[3972]: E0313 10:34:12.829294 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:12.930165 master-0 kubenswrapper[3972]: E0313 10:34:12.930077 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:13.030789 master-0 kubenswrapper[3972]: E0313 10:34:13.030631 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:13.131068 master-0 kubenswrapper[3972]: E0313 10:34:13.130992 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:13.232193 master-0 kubenswrapper[3972]: E0313 10:34:13.232079 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:13.333310 master-0 kubenswrapper[3972]: E0313 10:34:13.333161 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:13.434029 master-0 kubenswrapper[3972]: E0313 10:34:13.433952 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:34:13.534514 master-0 kubenswrapper[3972]: E0313 10:34:13.534406 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 
Mar 13 10:34:13.635447 master-0 kubenswrapper[3972]: E0313 10:34:13.635371 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:13.736656 master-0 kubenswrapper[3972]: E0313 10:34:13.736544 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:13.837414 master-0 kubenswrapper[3972]: E0313 10:34:13.837259 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:13.937956 master-0 kubenswrapper[3972]: E0313 10:34:13.937747 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.038787 master-0 kubenswrapper[3972]: E0313 10:34:14.038696 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.139940 master-0 kubenswrapper[3972]: E0313 10:34:14.139829 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.240392 master-0 kubenswrapper[3972]: E0313 10:34:14.240238 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.341257 master-0 kubenswrapper[3972]: E0313 10:34:14.341160 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.441727 master-0 kubenswrapper[3972]: E0313 10:34:14.441578 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.542020 master-0 kubenswrapper[3972]: E0313 10:34:14.541835 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.642809 master-0 kubenswrapper[3972]: E0313 10:34:14.642737 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.743848 master-0 kubenswrapper[3972]: E0313 10:34:14.743677 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.844866 master-0 kubenswrapper[3972]: E0313 10:34:14.844613 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:14.945121 master-0 kubenswrapper[3972]: E0313 10:34:14.944892 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.045473 master-0 kubenswrapper[3972]: E0313 10:34:15.045374 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.145925 master-0 kubenswrapper[3972]: E0313 10:34:15.145824 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.246151 master-0 kubenswrapper[3972]: E0313 10:34:15.246011 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.346743 master-0 kubenswrapper[3972]: E0313 10:34:15.346590 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.447640 master-0 kubenswrapper[3972]: E0313 10:34:15.447417 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.547789 master-0 kubenswrapper[3972]: E0313 10:34:15.547648 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.648843 master-0 kubenswrapper[3972]: E0313 10:34:15.648741 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.749688 master-0 kubenswrapper[3972]: E0313 10:34:15.749421 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.850636 master-0 kubenswrapper[3972]: E0313 10:34:15.850508 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:15.951540 master-0 kubenswrapper[3972]: E0313 10:34:15.951380 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.051871 master-0 kubenswrapper[3972]: E0313 10:34:16.051573 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.152313 master-0 kubenswrapper[3972]: E0313 10:34:16.152197 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.252727 master-0 kubenswrapper[3972]: E0313 10:34:16.252597 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.353663 master-0 kubenswrapper[3972]: E0313 10:34:16.353478 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.454445 master-0 kubenswrapper[3972]: E0313 10:34:16.454329 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.555047 master-0 kubenswrapper[3972]: E0313 10:34:16.554951 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.655851 master-0 kubenswrapper[3972]: E0313 10:34:16.655735 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.756130 master-0 kubenswrapper[3972]: E0313 10:34:16.756012 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.856958 master-0 kubenswrapper[3972]: E0313 10:34:16.856835 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:16.958081 master-0 kubenswrapper[3972]: E0313 10:34:16.957900 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.059130 master-0 kubenswrapper[3972]: E0313 10:34:17.058971 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.159356 master-0 kubenswrapper[3972]: E0313 10:34:17.159250 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.259963 master-0 kubenswrapper[3972]: E0313 10:34:17.259767 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.360159 master-0 kubenswrapper[3972]: E0313 10:34:17.360048 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.461189 master-0 kubenswrapper[3972]: E0313 10:34:17.461067 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.562435 master-0 kubenswrapper[3972]: E0313 10:34:17.562257 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.663049 master-0 kubenswrapper[3972]: E0313 10:34:17.662965 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.763701 master-0 kubenswrapper[3972]: E0313 10:34:17.763607 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.864642 master-0 kubenswrapper[3972]: E0313 10:34:17.864478 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:17.965190 master-0 kubenswrapper[3972]: E0313 10:34:17.965088 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.065763 master-0 kubenswrapper[3972]: E0313 10:34:18.065665 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.166874 master-0 kubenswrapper[3972]: E0313 10:34:18.166795 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.231039 master-0 kubenswrapper[3972]: E0313 10:34:18.230923 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:34:18.267329 master-0 kubenswrapper[3972]: E0313 10:34:18.267228 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.368380 master-0 kubenswrapper[3972]: E0313 10:34:18.368240 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.469306 master-0 kubenswrapper[3972]: E0313 10:34:18.469069 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.569777 master-0 kubenswrapper[3972]: E0313 10:34:18.569665 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.670401 master-0 kubenswrapper[3972]: E0313 10:34:18.670326 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.771495 master-0 kubenswrapper[3972]: E0313 10:34:18.771312 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.872254 master-0 kubenswrapper[3972]: E0313 10:34:18.872155 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:18.972991 master-0 kubenswrapper[3972]: E0313 10:34:18.972860 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.073942 master-0 kubenswrapper[3972]: E0313 10:34:19.073769 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.174822 master-0 kubenswrapper[3972]: E0313 10:34:19.174743 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.275463 master-0 kubenswrapper[3972]: E0313 10:34:19.275376 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.375874 master-0 kubenswrapper[3972]: E0313 10:34:19.375804 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.476255 master-0 kubenswrapper[3972]: E0313 10:34:19.476136 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.577137 master-0 kubenswrapper[3972]: E0313 10:34:19.577053 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.654178 master-0 kubenswrapper[3972]: E0313 10:34:19.653920 3972 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found"
Mar 13 10:34:19.783341 master-0 kubenswrapper[3972]: E0313 10:34:19.783293 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.884380 master-0 kubenswrapper[3972]: E0313 10:34:19.884235 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:19.984658 master-0 kubenswrapper[3972]: E0313 10:34:19.984400 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.084685 master-0 kubenswrapper[3972]: E0313 10:34:20.084541 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.185659 master-0 kubenswrapper[3972]: E0313 10:34:20.185576 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.285901 master-0 kubenswrapper[3972]: E0313 10:34:20.285749 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.386289 master-0 kubenswrapper[3972]: E0313 10:34:20.386195 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.486574 master-0 kubenswrapper[3972]: E0313 10:34:20.486462 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.587903 master-0 kubenswrapper[3972]: E0313 10:34:20.587692 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.688645 master-0 kubenswrapper[3972]: E0313 10:34:20.688531 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.789761 master-0 kubenswrapper[3972]: E0313 10:34:20.789636 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.890598 master-0 kubenswrapper[3972]: E0313 10:34:20.890501 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:20.991348 master-0 kubenswrapper[3972]: E0313 10:34:20.991227 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.091774 master-0 kubenswrapper[3972]: E0313 10:34:21.091647 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.192858 master-0 kubenswrapper[3972]: E0313 10:34:21.192634 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.293773 master-0 kubenswrapper[3972]: E0313 10:34:21.293656 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.394305 master-0 kubenswrapper[3972]: E0313 10:34:21.394193 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.495657 master-0 kubenswrapper[3972]: E0313 10:34:21.495418 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.596748 master-0 kubenswrapper[3972]: E0313 10:34:21.596616 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.696969 master-0 kubenswrapper[3972]: E0313 10:34:21.696841 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.797387 master-0 kubenswrapper[3972]: E0313 10:34:21.797223 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.898490 master-0 kubenswrapper[3972]: E0313 10:34:21.898396 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:21.999079 master-0 kubenswrapper[3972]: E0313 10:34:21.998978 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.099371 master-0 kubenswrapper[3972]: E0313 10:34:22.099155 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.199827 master-0 kubenswrapper[3972]: E0313 10:34:22.199684 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.300578 master-0 kubenswrapper[3972]: E0313 10:34:22.300426 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.389934 master-0 kubenswrapper[3972]: I0313 10:34:22.389787 3972 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 10:34:22.400800 master-0 kubenswrapper[3972]: E0313 10:34:22.400715 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.501297 master-0 kubenswrapper[3972]: E0313 10:34:22.501179 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.601665 master-0 kubenswrapper[3972]: E0313 10:34:22.601554 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.701954 master-0 kubenswrapper[3972]: E0313 10:34:22.701685 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.802677 master-0 kubenswrapper[3972]: E0313 10:34:22.802537 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:22.903780 master-0 kubenswrapper[3972]: E0313 10:34:22.903585 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.004655 master-0 kubenswrapper[3972]: E0313 10:34:23.004422 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.105627 master-0 kubenswrapper[3972]: E0313 10:34:23.105474 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.206042 master-0 kubenswrapper[3972]: E0313 10:34:23.205891 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.306623 master-0 kubenswrapper[3972]: E0313 10:34:23.306406 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.407339 master-0 kubenswrapper[3972]: E0313 10:34:23.407232 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.507652 master-0 kubenswrapper[3972]: E0313 10:34:23.507477 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.608243 master-0 kubenswrapper[3972]: E0313 10:34:23.607987 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.709003 master-0 kubenswrapper[3972]: E0313 10:34:23.708863 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.810148 master-0 kubenswrapper[3972]: E0313 10:34:23.810050 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:23.910829 master-0 kubenswrapper[3972]: E0313 10:34:23.910668 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.011400 master-0 kubenswrapper[3972]: E0313 10:34:24.011235 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.111867 master-0 kubenswrapper[3972]: E0313 10:34:24.111705 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.212244 master-0 kubenswrapper[3972]: E0313 10:34:24.212015 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.313003 master-0 kubenswrapper[3972]: E0313 10:34:24.312930 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.413817 master-0 kubenswrapper[3972]: E0313 10:34:24.413751 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.514204 master-0 kubenswrapper[3972]: E0313 10:34:24.513974 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.615274 master-0 kubenswrapper[3972]: E0313 10:34:24.615188 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.716063 master-0 kubenswrapper[3972]: E0313 10:34:24.715988 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.816637 master-0 kubenswrapper[3972]: E0313 10:34:24.816469 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:24.916683 master-0 kubenswrapper[3972]: E0313 10:34:24.916581 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.017025 master-0 kubenswrapper[3972]: E0313 10:34:25.016942 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.117607 master-0 kubenswrapper[3972]: E0313 10:34:25.117393 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.218484 master-0 kubenswrapper[3972]: E0313 10:34:25.218402 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.318879 master-0 kubenswrapper[3972]: E0313 10:34:25.318783 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.419568 master-0 kubenswrapper[3972]: E0313 10:34:25.419490 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.520743 master-0 kubenswrapper[3972]: E0313 10:34:25.520655 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.621902 master-0 kubenswrapper[3972]: E0313 10:34:25.621813 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.722860 master-0 kubenswrapper[3972]: E0313 10:34:25.722684 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.823838 master-0 kubenswrapper[3972]: E0313 10:34:25.823775 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:25.924909 master-0 kubenswrapper[3972]: E0313 10:34:25.924853 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.025336 master-0 kubenswrapper[3972]: E0313 10:34:26.025176 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.126293 master-0 kubenswrapper[3972]: E0313 10:34:26.126215 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.227175 master-0 kubenswrapper[3972]: E0313 10:34:26.227086 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.328435 master-0 kubenswrapper[3972]: E0313 10:34:26.328247 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.429188 master-0 kubenswrapper[3972]: E0313 10:34:26.429064 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.530273 master-0 kubenswrapper[3972]: E0313 10:34:26.530182 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.631287 master-0 kubenswrapper[3972]: E0313 10:34:26.631218 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.731755 master-0 kubenswrapper[3972]: E0313 10:34:26.731665 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.832815 master-0 kubenswrapper[3972]: E0313 10:34:26.832703 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:26.933944 master-0 kubenswrapper[3972]: E0313 10:34:26.933751 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.034241 master-0 kubenswrapper[3972]: E0313 10:34:27.034167 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.135032 master-0 kubenswrapper[3972]: E0313 10:34:27.134944 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.235316 master-0 kubenswrapper[3972]: E0313 10:34:27.235162 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.335625 master-0 kubenswrapper[3972]: E0313 10:34:27.335546 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.436816 master-0 kubenswrapper[3972]: E0313 10:34:27.436688 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.537339 master-0 kubenswrapper[3972]: E0313 10:34:27.537187 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.638477 master-0 kubenswrapper[3972]: E0313 10:34:27.638363 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.739292 master-0 kubenswrapper[3972]: E0313 10:34:27.739175 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.840145 master-0 kubenswrapper[3972]: E0313 10:34:27.839956 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:27.941294 master-0 kubenswrapper[3972]: E0313 10:34:27.941191 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.041914 master-0 kubenswrapper[3972]: E0313 10:34:28.041831 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.143064 master-0 kubenswrapper[3972]: E0313 10:34:28.142995 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.232241 master-0 kubenswrapper[3972]: E0313 10:34:28.232142 3972 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 13 10:34:28.243962 master-0 kubenswrapper[3972]: E0313 10:34:28.243917 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.344283 master-0 kubenswrapper[3972]: E0313 10:34:28.344085 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.445737 master-0 kubenswrapper[3972]: E0313 10:34:28.445214 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.546149 master-0 kubenswrapper[3972]: E0313 10:34:28.545965 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.646960 master-0 kubenswrapper[3972]: E0313 10:34:28.646829 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.747715 master-0 kubenswrapper[3972]: E0313 10:34:28.747550 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.848612 master-0 kubenswrapper[3972]: E0313 10:34:28.848495 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:28.949766 master-0 kubenswrapper[3972]: E0313 10:34:28.949656 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.050382 master-0 kubenswrapper[3972]: E0313 10:34:29.050173 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.150423 master-0 kubenswrapper[3972]: E0313 10:34:29.150342 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.250950 master-0 kubenswrapper[3972]: E0313 10:34:29.250830 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.352274 master-0 kubenswrapper[3972]: E0313 10:34:29.352056 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.452870 master-0 kubenswrapper[3972]: E0313 10:34:29.452768 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.553253 master-0 kubenswrapper[3972]: E0313 10:34:29.553168 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.653425 master-0 kubenswrapper[3972]: E0313 10:34:29.653320 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.754115 master-0 kubenswrapper[3972]: E0313 10:34:29.753998 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.855343 master-0 kubenswrapper[3972]: E0313 10:34:29.855235 3972 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:34:29.890990 master-0 kubenswrapper[3972]: I0313 10:34:29.890878 3972 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 10:34:30.213674 master-0 kubenswrapper[3972]: I0313 10:34:30.213609 3972 apiserver.go:52] "Watching apiserver"
Mar 13 10:34:30.219440 master-0 kubenswrapper[3972]: I0313 10:34:30.219375 3972 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 10:34:30.219869 master-0 kubenswrapper[3972]: I0313 10:34:30.219802 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm","openshift-network-operator/network-operator-7c649bf6d4-z9wrg","assisted-installer/assisted-installer-controller-k96f8"]
Mar 13 10:34:30.221612 master-0 kubenswrapper[3972]: I0313 10:34:30.221543 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.221774 master-0 kubenswrapper[3972]: I0313 10:34:30.221543 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:34:30.221900 master-0 kubenswrapper[3972]: I0313 10:34:30.221543 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:30.225531 master-0 kubenswrapper[3972]: I0313 10:34:30.225410 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.225950 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.225980 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.225970 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.226730 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.226816 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.226835 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.226830 3972 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret"
Mar 13 10:34:30.227641 master-0 kubenswrapper[3972]: I0313 10:34:30.226963 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt"
Mar 13 10:34:30.228692 master-0 kubenswrapper[3972]: I0313 10:34:30.227841 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 10:34:30.285125 master-0 kubenswrapper[3972]: I0313 10:34:30.285008 3972 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 13 10:34:30.339536 master-0 kubenswrapper[3972]: I0313 10:34:30.339483 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-sno-bootstrap-files\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:30.339877 master-0 kubenswrapper[3972]: I0313 10:34:30.339851 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:34:30.339989 master-0 kubenswrapper[3972]: I0313 10:34:30.339972 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.340297 master-0 kubenswrapper[3972]: I0313 10:34:30.340260 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-var-run-resolv-conf\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:30.340465 master-0 kubenswrapper[3972]: I0313 10:34:30.340442 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.340584 master-0 kubenswrapper[3972]: I0313 10:34:30.340566 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:34:30.340711 master-0 kubenswrapper[3972]: I0313 10:34:30.340688 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.340898 master-0 kubenswrapper[3972]: I0313 10:34:30.340875 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.341027 master-0 kubenswrapper[3972]: I0313 10:34:30.341010 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.341189 master-0 kubenswrapper[3972]: I0313 10:34:30.341163 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-ca-bundle\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:30.341303 master-0 kubenswrapper[3972]: I0313 10:34:30.341284 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sr7xx\" (UniqueName: \"kubernetes.io/projected/2982c23c-b1dc-4090-9de1-a5c555ac6dad-kube-api-access-sr7xx\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:30.341555 master-0 kubenswrapper[3972]: I0313 10:34:30.341439 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-resolv-conf\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:30.341555 master-0 kubenswrapper[3972]: I0313 10:34:30.341476 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:34:30.442402 master-0 kubenswrapper[3972]: I0313 10:34:30.442277 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.442402 master-0 kubenswrapper[3972]: I0313 10:34:30.442382 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:30.442723 master-0 kubenswrapper[3972]: I0313 10:34:30.442675 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") "
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.442821 master-0 kubenswrapper[3972]: I0313 10:34:30.442748 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-ca-bundle\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.442860 master-0 kubenswrapper[3972]: I0313 10:34:30.442844 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-ca-bundle\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.442944 master-0 kubenswrapper[3972]: I0313 10:34:30.442892 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sr7xx\" (UniqueName: \"kubernetes.io/projected/2982c23c-b1dc-4090-9de1-a5c555ac6dad-kube-api-access-sr7xx\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.443033 master-0 kubenswrapper[3972]: I0313 10:34:30.442981 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.443167 master-0 kubenswrapper[3972]: I0313 10:34:30.443036 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: 
\"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-resolv-conf\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.443471 master-0 kubenswrapper[3972]: I0313 10:34:30.443366 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.443588 master-0 kubenswrapper[3972]: I0313 10:34:30.443532 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-sno-bootstrap-files\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.443716 master-0 kubenswrapper[3972]: I0313 10:34:30.443554 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.443716 master-0 kubenswrapper[3972]: I0313 10:34:30.443632 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.443820 master-0 
kubenswrapper[3972]: I0313 10:34:30.443407 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-resolv-conf\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.443820 master-0 kubenswrapper[3972]: I0313 10:34:30.443722 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.443820 master-0 kubenswrapper[3972]: I0313 10:34:30.443751 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-sno-bootstrap-files\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.443961 master-0 kubenswrapper[3972]: I0313 10:34:30.443857 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-var-run-resolv-conf\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.443961 master-0 kubenswrapper[3972]: I0313 10:34:30.443904 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod 
\"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.443961 master-0 kubenswrapper[3972]: I0313 10:34:30.443947 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.444246 master-0 kubenswrapper[3972]: I0313 10:34:30.443955 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-var-run-resolv-conf\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.444246 master-0 kubenswrapper[3972]: I0313 10:34:30.444191 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.444246 master-0 kubenswrapper[3972]: E0313 10:34:30.444220 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:30.444605 master-0 kubenswrapper[3972]: E0313 10:34:30.444568 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:34:30.944347426 +0000 UTC m=+73.562463884 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:30.444836 master-0 kubenswrapper[3972]: I0313 10:34:30.444774 3972 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 10:34:30.445595 master-0 kubenswrapper[3972]: I0313 10:34:30.445494 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.464232 master-0 kubenswrapper[3972]: I0313 10:34:30.463984 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.486211 master-0 kubenswrapper[3972]: I0313 10:34:30.486147 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.487509 master-0 
kubenswrapper[3972]: I0313 10:34:30.487451 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.488134 master-0 kubenswrapper[3972]: I0313 10:34:30.488081 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sr7xx\" (UniqueName: \"kubernetes.io/projected/2982c23c-b1dc-4090-9de1-a5c555ac6dad-kube-api-access-sr7xx\") pod \"assisted-installer-controller-k96f8\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.571476 master-0 kubenswrapper[3972]: I0313 10:34:30.571396 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:34:30.583592 master-0 kubenswrapper[3972]: I0313 10:34:30.583539 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:30.591675 master-0 kubenswrapper[3972]: W0313 10:34:30.591552 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d2fdba3_9478_4165_9207_d01483625607.slice/crio-a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a WatchSource:0}: Error finding container a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a: Status 404 returned error can't find the container with id a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a Mar 13 10:34:30.600484 master-0 kubenswrapper[3972]: W0313 10:34:30.600395 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2982c23c_b1dc_4090_9de1_a5c555ac6dad.slice/crio-30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308 WatchSource:0}: Error finding container 30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308: Status 404 returned error can't find the container with id 30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308 Mar 13 10:34:30.665174 master-0 kubenswrapper[3972]: I0313 10:34:30.665078 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-k96f8" event={"ID":"2982c23c-b1dc-4090-9de1-a5c555ac6dad","Type":"ContainerStarted","Data":"30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308"} Mar 13 10:34:30.667290 master-0 kubenswrapper[3972]: I0313 10:34:30.667209 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a"} Mar 13 10:34:30.948948 master-0 kubenswrapper[3972]: I0313 10:34:30.948519 3972 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:30.948948 master-0 kubenswrapper[3972]: E0313 10:34:30.948914 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:30.948948 master-0 kubenswrapper[3972]: E0313 10:34:30.949000 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:31.948975568 +0000 UTC m=+74.567091966 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:31.955365 master-0 kubenswrapper[3972]: I0313 10:34:31.955305 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:31.956506 master-0 kubenswrapper[3972]: E0313 10:34:31.956427 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:31.956770 master-0 kubenswrapper[3972]: E0313 10:34:31.956620 
3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:33.956569541 +0000 UTC m=+76.574685969 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:33.971518 master-0 kubenswrapper[3972]: I0313 10:34:33.971457 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:33.972046 master-0 kubenswrapper[3972]: E0313 10:34:33.971622 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:33.972046 master-0 kubenswrapper[3972]: E0313 10:34:33.971727 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:37.971707636 +0000 UTC m=+80.589824024 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:37.791866 master-0 kubenswrapper[3972]: I0313 10:34:37.791115 3972 generic.go:334] "Generic (PLEG): container finished" podID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerID="39e3998474ffa5421ada785b69659b745abc434915dc0302700b2f60923ba978" exitCode=0 Mar 13 10:34:37.791866 master-0 kubenswrapper[3972]: I0313 10:34:37.791141 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-k96f8" event={"ID":"2982c23c-b1dc-4090-9de1-a5c555ac6dad","Type":"ContainerDied","Data":"39e3998474ffa5421ada785b69659b745abc434915dc0302700b2f60923ba978"} Mar 13 10:34:37.794537 master-0 kubenswrapper[3972]: I0313 10:34:37.794504 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"c06a4f7f54577d80872f3a5157b329f2c2ec17e43e599b09564a82e127162989"} Mar 13 10:34:37.829840 master-0 kubenswrapper[3972]: I0313 10:34:37.821260 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" podStartSLOduration=31.672562408 podStartE2EDuration="37.821188155s" podCreationTimestamp="2026-03-13 10:34:00 +0000 UTC" firstStartedPulling="2026-03-13 10:34:30.594901575 +0000 UTC m=+73.213017993" lastFinishedPulling="2026-03-13 10:34:36.743527352 +0000 UTC m=+79.361643740" observedRunningTime="2026-03-13 10:34:37.819742289 +0000 UTC m=+80.437858677" watchObservedRunningTime="2026-03-13 10:34:37.821188155 +0000 UTC m=+80.439304543" Mar 13 10:34:37.986120 master-0 kubenswrapper[3972]: I0313 10:34:37.983925 3972 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:34:37.986120 master-0 kubenswrapper[3972]: E0313 10:34:37.984284 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:37.986120 master-0 kubenswrapper[3972]: E0313 10:34:37.984385 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:45.984350547 +0000 UTC m=+88.602466965 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:34:38.327580 master-0 kubenswrapper[3972]: I0313 10:34:38.327500 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 10:34:38.965925 master-0 kubenswrapper[3972]: I0313 10:34:38.965844 3972 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:34:39.152838 master-0 kubenswrapper[3972]: I0313 10:34:39.152766 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sr7xx\" (UniqueName: \"kubernetes.io/projected/2982c23c-b1dc-4090-9de1-a5c555ac6dad-kube-api-access-sr7xx\") pod \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " Mar 13 10:34:39.152838 master-0 kubenswrapper[3972]: I0313 10:34:39.152825 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-var-run-resolv-conf\") pod \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " Mar 13 10:34:39.152838 master-0 kubenswrapper[3972]: I0313 10:34:39.152850 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-ca-bundle\") pod \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " Mar 13 10:34:39.153223 master-0 kubenswrapper[3972]: I0313 10:34:39.152869 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-resolv-conf\") pod \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " Mar 13 10:34:39.153223 master-0 kubenswrapper[3972]: I0313 10:34:39.152895 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-sno-bootstrap-files\") pod \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\" (UID: \"2982c23c-b1dc-4090-9de1-a5c555ac6dad\") " Mar 13 10:34:39.153223 master-0 
kubenswrapper[3972]: I0313 10:34:39.153028 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "2982c23c-b1dc-4090-9de1-a5c555ac6dad" (UID: "2982c23c-b1dc-4090-9de1-a5c555ac6dad"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:34:39.153223 master-0 kubenswrapper[3972]: I0313 10:34:39.153031 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "2982c23c-b1dc-4090-9de1-a5c555ac6dad" (UID: "2982c23c-b1dc-4090-9de1-a5c555ac6dad"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:34:39.153223 master-0 kubenswrapper[3972]: I0313 10:34:39.153066 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "2982c23c-b1dc-4090-9de1-a5c555ac6dad" (UID: "2982c23c-b1dc-4090-9de1-a5c555ac6dad"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:34:39.153223 master-0 kubenswrapper[3972]: I0313 10:34:39.153162 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "2982c23c-b1dc-4090-9de1-a5c555ac6dad" (UID: "2982c23c-b1dc-4090-9de1-a5c555ac6dad"). InnerVolumeSpecName "host-resolv-conf". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:34:39.162735 master-0 kubenswrapper[3972]: I0313 10:34:39.162668 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2982c23c-b1dc-4090-9de1-a5c555ac6dad-kube-api-access-sr7xx" (OuterVolumeSpecName: "kube-api-access-sr7xx") pod "2982c23c-b1dc-4090-9de1-a5c555ac6dad" (UID: "2982c23c-b1dc-4090-9de1-a5c555ac6dad"). InnerVolumeSpecName "kube-api-access-sr7xx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:34:39.254155 master-0 kubenswrapper[3972]: I0313 10:34:39.254062 3972 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sr7xx\" (UniqueName: \"kubernetes.io/projected/2982c23c-b1dc-4090-9de1-a5c555ac6dad-kube-api-access-sr7xx\") on node \"master-0\" DevicePath \"\"" Mar 13 10:34:39.254155 master-0 kubenswrapper[3972]: I0313 10:34:39.254128 3972 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:34:39.254155 master-0 kubenswrapper[3972]: I0313 10:34:39.254155 3972 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 13 10:34:39.254469 master-0 kubenswrapper[3972]: I0313 10:34:39.254168 3972 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\"" Mar 13 10:34:39.254469 master-0 kubenswrapper[3972]: I0313 10:34:39.254182 3972 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/2982c23c-b1dc-4090-9de1-a5c555ac6dad-sno-bootstrap-files\") on node \"master-0\" DevicePath 
\"\""
Mar 13 10:34:39.290459 master-0 kubenswrapper[3972]: I0313 10:34:39.290363 3972 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 10:34:39.738011 master-0 kubenswrapper[3972]: I0313 10:34:39.737915 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=1.7378725940000002 podStartE2EDuration="1.737872594s" podCreationTimestamp="2026-03-13 10:34:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:34:38.998319957 +0000 UTC m=+81.616436405" watchObservedRunningTime="2026-03-13 10:34:39.737872594 +0000 UTC m=+82.355988992"
Mar 13 10:34:39.738260 master-0 kubenswrapper[3972]: I0313 10:34:39.738142 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-2lpv4"]
Mar 13 10:34:39.738353 master-0 kubenswrapper[3972]: E0313 10:34:39.738324 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller"
Mar 13 10:34:39.738413 master-0 kubenswrapper[3972]: I0313 10:34:39.738383 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller"
Mar 13 10:34:39.738513 master-0 kubenswrapper[3972]: I0313 10:34:39.738478 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller"
Mar 13 10:34:39.738839 master-0 kubenswrapper[3972]: I0313 10:34:39.738801 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:39.758529 master-0 kubenswrapper[3972]: I0313 10:34:39.758472 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgmh6\" (UniqueName: \"kubernetes.io/projected/6c9c8030-b756-4ec4-b585-19672dc61df1-kube-api-access-hgmh6\") pod \"mtu-prober-2lpv4\" (UID: \"6c9c8030-b756-4ec4-b585-19672dc61df1\") " pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:39.801247 master-0 kubenswrapper[3972]: I0313 10:34:39.801170 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-k96f8" event={"ID":"2982c23c-b1dc-4090-9de1-a5c555ac6dad","Type":"ContainerDied","Data":"30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308"}
Mar 13 10:34:39.801247 master-0 kubenswrapper[3972]: I0313 10:34:39.801235 3972 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308"
Mar 13 10:34:39.801539 master-0 kubenswrapper[3972]: I0313 10:34:39.801376 3972 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:34:39.859539 master-0 kubenswrapper[3972]: I0313 10:34:39.859471 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgmh6\" (UniqueName: \"kubernetes.io/projected/6c9c8030-b756-4ec4-b585-19672dc61df1-kube-api-access-hgmh6\") pod \"mtu-prober-2lpv4\" (UID: \"6c9c8030-b756-4ec4-b585-19672dc61df1\") " pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:39.879057 master-0 kubenswrapper[3972]: I0313 10:34:39.878962 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgmh6\" (UniqueName: \"kubernetes.io/projected/6c9c8030-b756-4ec4-b585-19672dc61df1-kube-api-access-hgmh6\") pod \"mtu-prober-2lpv4\" (UID: \"6c9c8030-b756-4ec4-b585-19672dc61df1\") " pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:40.057143 master-0 kubenswrapper[3972]: I0313 10:34:40.056959 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:40.071389 master-0 kubenswrapper[3972]: W0313 10:34:40.071263 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c9c8030_b756_4ec4_b585_19672dc61df1.slice/crio-a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade WatchSource:0}: Error finding container a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade: Status 404 returned error can't find the container with id a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade
Mar 13 10:34:40.807463 master-0 kubenswrapper[3972]: I0313 10:34:40.807273 3972 generic.go:334] "Generic (PLEG): container finished" podID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerID="34b7e36b0204fb75f5eaa9ffadb1e13d0888ef1773ea6fc2201df90d0a2dcd5e" exitCode=0
Mar 13 10:34:40.807463 master-0 kubenswrapper[3972]: I0313 10:34:40.807331 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-2lpv4" event={"ID":"6c9c8030-b756-4ec4-b585-19672dc61df1","Type":"ContainerDied","Data":"34b7e36b0204fb75f5eaa9ffadb1e13d0888ef1773ea6fc2201df90d0a2dcd5e"}
Mar 13 10:34:40.807463 master-0 kubenswrapper[3972]: I0313 10:34:40.807361 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-2lpv4" event={"ID":"6c9c8030-b756-4ec4-b585-19672dc61df1","Type":"ContainerStarted","Data":"a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade"}
Mar 13 10:34:41.834510 master-0 kubenswrapper[3972]: I0313 10:34:41.834226 3972 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:41.871861 master-0 kubenswrapper[3972]: I0313 10:34:41.871755 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hgmh6\" (UniqueName: \"kubernetes.io/projected/6c9c8030-b756-4ec4-b585-19672dc61df1-kube-api-access-hgmh6\") pod \"6c9c8030-b756-4ec4-b585-19672dc61df1\" (UID: \"6c9c8030-b756-4ec4-b585-19672dc61df1\") "
Mar 13 10:34:41.874985 master-0 kubenswrapper[3972]: I0313 10:34:41.874927 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6c9c8030-b756-4ec4-b585-19672dc61df1-kube-api-access-hgmh6" (OuterVolumeSpecName: "kube-api-access-hgmh6") pod "6c9c8030-b756-4ec4-b585-19672dc61df1" (UID: "6c9c8030-b756-4ec4-b585-19672dc61df1"). InnerVolumeSpecName "kube-api-access-hgmh6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:34:41.972958 master-0 kubenswrapper[3972]: I0313 10:34:41.972816 3972 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hgmh6\" (UniqueName: \"kubernetes.io/projected/6c9c8030-b756-4ec4-b585-19672dc61df1-kube-api-access-hgmh6\") on node \"master-0\" DevicePath \"\""
Mar 13 10:34:42.816145 master-0 kubenswrapper[3972]: I0313 10:34:42.816009 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-2lpv4" event={"ID":"6c9c8030-b756-4ec4-b585-19672dc61df1","Type":"ContainerDied","Data":"a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade"}
Mar 13 10:34:42.816145 master-0 kubenswrapper[3972]: I0313 10:34:42.816088 3972 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-2lpv4"
Mar 13 10:34:42.816567 master-0 kubenswrapper[3972]: I0313 10:34:42.816085 3972 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade"
Mar 13 10:34:44.742667 master-0 kubenswrapper[3972]: I0313 10:34:44.742597 3972 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-2lpv4"]
Mar 13 10:34:44.746326 master-0 kubenswrapper[3972]: I0313 10:34:44.746242 3972 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-2lpv4"]
Mar 13 10:34:46.000559 master-0 kubenswrapper[3972]: I0313 10:34:46.000330 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:34:46.001993 master-0 kubenswrapper[3972]: E0313 10:34:46.000765 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 10:34:46.001993 master-0 kubenswrapper[3972]: E0313 10:34:46.001001 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:02.000940785 +0000 UTC m=+104.619057253 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found
Mar 13 10:34:46.318043 master-0 kubenswrapper[3972]: I0313 10:34:46.317860 3972 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" path="/var/lib/kubelet/pods/6c9c8030-b756-4ec4-b585-19672dc61df1/volumes"
Mar 13 10:34:46.322492 master-0 kubenswrapper[3972]: I0313 10:34:46.322450 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 13 10:34:48.331415 master-0 kubenswrapper[3972]: I0313 10:34:48.331311 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=2.3312942469999998 podStartE2EDuration="2.331294247s" podCreationTimestamp="2026-03-13 10:34:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:34:48.329744849 +0000 UTC m=+90.947861237" watchObservedRunningTime="2026-03-13 10:34:48.331294247 +0000 UTC m=+90.949410635"
Mar 13 10:34:49.627773 master-0 kubenswrapper[3972]: I0313 10:34:49.627683 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bjv5r"]
Mar 13 10:34:49.628691 master-0 kubenswrapper[3972]: E0313 10:34:49.627885 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerName="prober"
Mar 13 10:34:49.628691 master-0 kubenswrapper[3972]: I0313 10:34:49.627919 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerName="prober"
Mar 13 10:34:49.628691 master-0 kubenswrapper[3972]: I0313 10:34:49.627972 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerName="prober"
Mar 13 10:34:49.628691 master-0 kubenswrapper[3972]: I0313 10:34:49.628262 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.631677 master-0 kubenswrapper[3972]: I0313 10:34:49.631625 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 10:34:49.631977 master-0 kubenswrapper[3972]: I0313 10:34:49.631933 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 10:34:49.632300 master-0 kubenswrapper[3972]: I0313 10:34:49.632264 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 10:34:49.632639 master-0 kubenswrapper[3972]: I0313 10:34:49.632599 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 10:34:49.639374 master-0 kubenswrapper[3972]: I0313 10:34:49.639276 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639591 master-0 kubenswrapper[3972]: I0313 10:34:49.639388 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639591 master-0 kubenswrapper[3972]: I0313 10:34:49.639427 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639591 master-0 kubenswrapper[3972]: I0313 10:34:49.639452 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639591 master-0 kubenswrapper[3972]: I0313 10:34:49.639536 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639673 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639741 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639788 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639820 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639854 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639894 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.639956 master-0 kubenswrapper[3972]: I0313 10:34:49.639924 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.640550 master-0 kubenswrapper[3972]: I0313 10:34:49.639960 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.640550 master-0 kubenswrapper[3972]: I0313 10:34:49.640000 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.640550 master-0 kubenswrapper[3972]: I0313 10:34:49.640032 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.640550 master-0 kubenswrapper[3972]: I0313 10:34:49.640086 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.640550 master-0 kubenswrapper[3972]: I0313 10:34:49.640149 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740592 master-0 kubenswrapper[3972]: I0313 10:34:49.740517 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740592 master-0 kubenswrapper[3972]: I0313 10:34:49.740569 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740620 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740638 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740665 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740693 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740709 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740724 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740738 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740770 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740784 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740798 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740813 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740826 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.740894 master-0 kubenswrapper[3972]: I0313 10:34:49.740896 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741557 master-0 kubenswrapper[3972]: I0313 10:34:49.740919 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741557 master-0 kubenswrapper[3972]: I0313 10:34:49.740934 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741557 master-0 kubenswrapper[3972]: I0313 10:34:49.740995 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741557 master-0 kubenswrapper[3972]: I0313 10:34:49.741030 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741557 master-0 kubenswrapper[3972]: I0313 10:34:49.741049 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741780 master-0 kubenswrapper[3972]: I0313 10:34:49.741692 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741832 master-0 kubenswrapper[3972]: I0313 10:34:49.741703 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741832 master-0 kubenswrapper[3972]: I0313 10:34:49.741725 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741931 master-0 kubenswrapper[3972]: I0313 10:34:49.741843 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741931 master-0 kubenswrapper[3972]: I0313 10:34:49.741843 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741931 master-0 kubenswrapper[3972]: I0313 10:34:49.741877 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.741931 master-0 kubenswrapper[3972]: I0313 10:34:49.741907 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.742121 master-0 kubenswrapper[3972]: I0313 10:34:49.741943 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.742230 master-0 kubenswrapper[3972]: I0313 10:34:49.742171 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.742329 master-0 kubenswrapper[3972]: I0313 10:34:49.742299 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.742424 master-0 kubenswrapper[3972]: I0313 10:34:49.742384 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.742573 master-0 kubenswrapper[3972]: I0313 10:34:49.742443 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.742823 master-0 kubenswrapper[3972]: I0313 10:34:49.742793 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.879107 master-0 kubenswrapper[3972]: I0313 10:34:49.878910 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:49.899255 master-0 kubenswrapper[3972]: I0313 10:34:49.899189 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-72t2n"]
Mar 13 10:34:49.905922 master-0 kubenswrapper[3972]: I0313 10:34:49.904797 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:49.908220 master-0 kubenswrapper[3972]: I0313 10:34:49.908037 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 13 10:34:49.908220 master-0 kubenswrapper[3972]: I0313 10:34:49.908049 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 13 10:34:49.948213 master-0 kubenswrapper[3972]: I0313 10:34:49.948150 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bjv5r"
Mar 13 10:34:50.060256 master-0 kubenswrapper[3972]: I0313 10:34:50.060153 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060641 master-0 kubenswrapper[3972]: I0313 10:34:50.060341 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060641 master-0 kubenswrapper[3972]: I0313 10:34:50.060457 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060641 master-0 kubenswrapper[3972]: I0313 10:34:50.060508 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060641 master-0 kubenswrapper[3972]: I0313 10:34:50.060580 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060962 master-0 kubenswrapper[3972]: I0313 10:34:50.060711 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060962 master-0 kubenswrapper[3972]: I0313 10:34:50.060761 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.060962 master-0 kubenswrapper[3972]: I0313 10:34:50.060795 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162072 master-0 kubenswrapper[3972]: I0313 10:34:50.161826 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162436 master-0 kubenswrapper[3972]: I0313 10:34:50.162192 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162436 master-0 kubenswrapper[3972]: I0313 10:34:50.162325 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162436 master-0 kubenswrapper[3972]: I0313 10:34:50.162385 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162436 master-0 kubenswrapper[3972]: I0313 10:34:50.162433 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162928 master-0 kubenswrapper[3972]: I0313 10:34:50.162508 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162928 master-0 kubenswrapper[3972]: I0313 10:34:50.162563 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162928 master-0 kubenswrapper[3972]: I0313 10:34:50.162623 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162928 master-0 kubenswrapper[3972]: I0313 10:34:50.162652 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162928 master-0 kubenswrapper[3972]: I0313 10:34:50.162675 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:34:50.162928 master-0 kubenswrapper[3972]: I0313 10:34:50.162727 3972 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.163516 master-0 kubenswrapper[3972]: I0313 10:34:50.162961 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.163516 master-0 kubenswrapper[3972]: I0313 10:34:50.163370 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.163715 master-0 kubenswrapper[3972]: I0313 10:34:50.163529 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.164131 master-0 kubenswrapper[3972]: I0313 10:34:50.164042 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " 
pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.188214 master-0 kubenswrapper[3972]: I0313 10:34:50.188080 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.218739 master-0 kubenswrapper[3972]: I0313 10:34:50.218657 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:34:50.235799 master-0 kubenswrapper[3972]: W0313 10:34:50.235735 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc66541c_6410_4824_b173_53747069429e.slice/crio-ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829 WatchSource:0}: Error finding container ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829: Status 404 returned error can't find the container with id ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829 Mar 13 10:34:50.605034 master-0 kubenswrapper[3972]: I0313 10:34:50.604893 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-c5vhc"] Mar 13 10:34:50.605667 master-0 kubenswrapper[3972]: I0313 10:34:50.605634 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:50.605818 master-0 kubenswrapper[3972]: E0313 10:34:50.605771 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:34:50.668430 master-0 kubenswrapper[3972]: I0313 10:34:50.668012 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:50.668430 master-0 kubenswrapper[3972]: I0313 10:34:50.668426 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:50.769603 master-0 kubenswrapper[3972]: I0313 10:34:50.769483 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:50.769603 master-0 kubenswrapper[3972]: I0313 10:34:50.769581 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:50.770009 master-0 kubenswrapper[3972]: E0313 10:34:50.769844 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:50.770009 master-0 kubenswrapper[3972]: E0313 10:34:50.769957 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:51.269911848 +0000 UTC m=+93.888028276 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:50.798859 master-0 kubenswrapper[3972]: I0313 10:34:50.798807 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:50.865369 master-0 kubenswrapper[3972]: I0313 10:34:50.865246 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjv5r" event={"ID":"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc","Type":"ContainerStarted","Data":"26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19"} Mar 13 10:34:50.866151 master-0 kubenswrapper[3972]: I0313 10:34:50.866118 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerStarted","Data":"ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829"} Mar 13 10:34:51.273361 master-0 kubenswrapper[3972]: I0313 10:34:51.273278 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:51.273730 master-0 kubenswrapper[3972]: E0313 10:34:51.273475 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:51.273730 master-0 kubenswrapper[3972]: E0313 10:34:51.273557 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:52.273537326 +0000 UTC m=+94.891653714 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:51.321081 master-0 kubenswrapper[3972]: W0313 10:34:51.320693 3972 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or 
"Localhost") Mar 13 10:34:51.321333 master-0 kubenswrapper[3972]: I0313 10:34:51.321147 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 10:34:52.279163 master-0 kubenswrapper[3972]: I0313 10:34:52.279009 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:52.280159 master-0 kubenswrapper[3972]: E0313 10:34:52.279280 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:52.280159 master-0 kubenswrapper[3972]: E0313 10:34:52.279443 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:54.279394916 +0000 UTC m=+96.897511304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:52.308732 master-0 kubenswrapper[3972]: I0313 10:34:52.308653 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:52.308988 master-0 kubenswrapper[3972]: E0313 10:34:52.308868 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:34:53.874618 master-0 kubenswrapper[3972]: I0313 10:34:53.874564 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerStarted","Data":"8b657fe74504b246eb725ae59f9af4bc83c980e78da29e84184ef677c02cddbf"} Mar 13 10:34:54.255992 master-0 kubenswrapper[3972]: I0313 10:34:54.255874 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=3.255829229 podStartE2EDuration="3.255829229s" podCreationTimestamp="2026-03-13 10:34:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:34:54.255330546 +0000 UTC m=+96.873446984" watchObservedRunningTime="2026-03-13 10:34:54.255829229 +0000 UTC m=+96.873945637" Mar 13 10:34:54.302746 master-0 kubenswrapper[3972]: I0313 10:34:54.302641 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:54.303011 master-0 kubenswrapper[3972]: E0313 10:34:54.302953 3972 secret.go:189] Couldn't get secret 
openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:54.303167 master-0 kubenswrapper[3972]: E0313 10:34:54.303075 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:34:58.303040501 +0000 UTC m=+100.921156919 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:54.309266 master-0 kubenswrapper[3972]: I0313 10:34:54.309224 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:54.309412 master-0 kubenswrapper[3972]: E0313 10:34:54.309387 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:34:54.881119 master-0 kubenswrapper[3972]: I0313 10:34:54.881056 3972 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="8b657fe74504b246eb725ae59f9af4bc83c980e78da29e84184ef677c02cddbf" exitCode=0 Mar 13 10:34:54.881568 master-0 kubenswrapper[3972]: I0313 10:34:54.881123 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"8b657fe74504b246eb725ae59f9af4bc83c980e78da29e84184ef677c02cddbf"} Mar 13 10:34:56.316032 master-0 kubenswrapper[3972]: I0313 10:34:56.313888 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:56.316032 master-0 kubenswrapper[3972]: E0313 10:34:56.314127 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:34:58.309184 master-0 kubenswrapper[3972]: I0313 10:34:58.308600 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:58.310616 master-0 kubenswrapper[3972]: E0313 10:34:58.309464 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:34:58.332350 master-0 kubenswrapper[3972]: I0313 10:34:58.332273 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:34:58.332749 master-0 kubenswrapper[3972]: E0313 10:34:58.332616 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:34:58.332749 master-0 kubenswrapper[3972]: E0313 10:34:58.332733 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:06.332708026 +0000 UTC m=+108.950824414 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:35:00.326396 master-0 kubenswrapper[3972]: I0313 10:35:00.326320 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:00.327813 master-0 kubenswrapper[3972]: E0313 10:35:00.326512 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:02.024427 master-0 kubenswrapper[3972]: I0313 10:35:02.024373 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"] Mar 13 10:35:02.025063 master-0 kubenswrapper[3972]: I0313 10:35:02.024965 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.027698 master-0 kubenswrapper[3972]: I0313 10:35:02.027628 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 10:35:02.027946 master-0 kubenswrapper[3972]: I0313 10:35:02.027906 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 10:35:02.028538 master-0 kubenswrapper[3972]: I0313 10:35:02.028419 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 10:35:02.028538 master-0 kubenswrapper[3972]: I0313 10:35:02.028466 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 10:35:02.028892 master-0 kubenswrapper[3972]: I0313 10:35:02.028420 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 10:35:02.046147 master-0 kubenswrapper[3972]: I0313 10:35:02.042992 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.046147 master-0 
kubenswrapper[3972]: I0313 10:35:02.043041 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.046147 master-0 kubenswrapper[3972]: I0313 10:35:02.043066 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.046147 master-0 kubenswrapper[3972]: I0313 10:35:02.043089 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.046147 master-0 kubenswrapper[3972]: I0313 10:35:02.043127 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:35:02.046147 master-0 kubenswrapper[3972]: E0313 10:35:02.043469 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret 
"cluster-version-operator-serving-cert" not found Mar 13 10:35:02.046147 master-0 kubenswrapper[3972]: E0313 10:35:02.043555 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:34.043535021 +0000 UTC m=+136.661651409 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:35:02.144012 master-0 kubenswrapper[3972]: I0313 10:35:02.143916 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.144012 master-0 kubenswrapper[3972]: I0313 10:35:02.143992 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.146683 master-0 kubenswrapper[3972]: I0313 10:35:02.144024 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.146683 master-0 kubenswrapper[3972]: I0313 10:35:02.144066 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.147291 master-0 kubenswrapper[3972]: I0313 10:35:02.147213 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.147582 master-0 kubenswrapper[3972]: I0313 10:35:02.147517 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.152059 master-0 kubenswrapper[3972]: I0313 10:35:02.152000 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.165919 master-0 kubenswrapper[3972]: I0313 10:35:02.162260 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.228075 master-0 kubenswrapper[3972]: I0313 10:35:02.227984 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2q2tp"] Mar 13 10:35:02.229632 master-0 kubenswrapper[3972]: I0313 10:35:02.229595 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.231840 master-0 kubenswrapper[3972]: I0313 10:35:02.231807 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 10:35:02.233415 master-0 kubenswrapper[3972]: I0313 10:35:02.233379 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 10:35:02.310561 master-0 kubenswrapper[3972]: I0313 10:35:02.310418 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:02.310790 master-0 kubenswrapper[3972]: E0313 10:35:02.310672 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:02.343177 master-0 kubenswrapper[3972]: I0313 10:35:02.343049 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:35:02.345554 master-0 kubenswrapper[3972]: I0313 10:35:02.345496 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-ovn\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345620 master-0 kubenswrapper[3972]: I0313 10:35:02.345559 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-klpg7\" (UniqueName: \"kubernetes.io/projected/bfb154e7-a689-4694-a500-cb76a91d924f-kube-api-access-klpg7\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345620 master-0 kubenswrapper[3972]: I0313 10:35:02.345588 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-netd\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345735 master-0 kubenswrapper[3972]: I0313 10:35:02.345668 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-slash\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345797 master-0 kubenswrapper[3972]: I0313 10:35:02.345757 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-var-lib-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345846 master-0 kubenswrapper[3972]: I0313 10:35:02.345819 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfb154e7-a689-4694-a500-cb76a91d924f-ovn-node-metrics-cert\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345935 master-0 kubenswrapper[3972]: I0313 10:35:02.345843 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-log-socket\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345935 master-0 kubenswrapper[3972]: I0313 10:35:02.345865 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-systemd\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.345935 master-0 kubenswrapper[3972]: I0313 10:35:02.345891 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-bin\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346090 master-0 kubenswrapper[3972]: I0313 10:35:02.345940 3972 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-systemd-units\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346090 master-0 kubenswrapper[3972]: I0313 10:35:02.345973 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346090 master-0 kubenswrapper[3972]: I0313 10:35:02.346017 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-netns\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346090 master-0 kubenswrapper[3972]: I0313 10:35:02.346059 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-node-log\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346317 master-0 kubenswrapper[3972]: I0313 10:35:02.346149 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-config\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346317 master-0 
kubenswrapper[3972]: I0313 10:35:02.346187 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-env-overrides\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346317 master-0 kubenswrapper[3972]: I0313 10:35:02.346234 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346317 master-0 kubenswrapper[3972]: I0313 10:35:02.346275 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346317 master-0 kubenswrapper[3972]: I0313 10:35:02.346298 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-script-lib\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346745 master-0 kubenswrapper[3972]: I0313 10:35:02.346343 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-kubelet\") pod 
\"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.346745 master-0 kubenswrapper[3972]: I0313 10:35:02.346380 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-etc-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.446782 master-0 kubenswrapper[3972]: I0313 10:35:02.446718 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-var-lib-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.446782 master-0 kubenswrapper[3972]: I0313 10:35:02.446776 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfb154e7-a689-4694-a500-cb76a91d924f-ovn-node-metrics-cert\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.446782 master-0 kubenswrapper[3972]: I0313 10:35:02.446800 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-log-socket\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447152 master-0 kubenswrapper[3972]: I0313 10:35:02.446866 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-log-socket\") pod 
\"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447152 master-0 kubenswrapper[3972]: I0313 10:35:02.446878 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-systemd-units\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447152 master-0 kubenswrapper[3972]: I0313 10:35:02.446892 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-var-lib-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447152 master-0 kubenswrapper[3972]: I0313 10:35:02.447136 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-systemd\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447165 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-systemd-units\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447185 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-systemd\") pod \"ovnkube-node-2q2tp\" (UID: 
\"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447205 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-bin\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447233 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-netns\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447250 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447274 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-node-log\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447333 master-0 kubenswrapper[3972]: I0313 10:35:02.447325 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447343 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-config\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447359 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-env-overrides\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447375 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447389 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-script-lib\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447405 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-kubelet\") pod 
\"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447420 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-etc-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447435 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-ovn\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447449 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-klpg7\" (UniqueName: \"kubernetes.io/projected/bfb154e7-a689-4694-a500-cb76a91d924f-kube-api-access-klpg7\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447482 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-netd\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447504 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-slash\") pod \"ovnkube-node-2q2tp\" 
(UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447548 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-slash\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447574 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-bin\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447599 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-netns\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447619 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447642 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-node-log\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.447691 master-0 kubenswrapper[3972]: I0313 10:35:02.447669 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448424 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-etc-openvswitch\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448488 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-ovn-kubernetes\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448500 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-config\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448556 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-kubelet\") pod \"ovnkube-node-2q2tp\" (UID: 
\"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448593 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-env-overrides\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448648 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-netd\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.448682 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-ovn\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.449360 master-0 kubenswrapper[3972]: I0313 10:35:02.449307 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-script-lib\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.450973 master-0 kubenswrapper[3972]: I0313 10:35:02.450934 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfb154e7-a689-4694-a500-cb76a91d924f-ovn-node-metrics-cert\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.515907 master-0 kubenswrapper[3972]: I0313 10:35:02.515853 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-klpg7\" (UniqueName: \"kubernetes.io/projected/bfb154e7-a689-4694-a500-cb76a91d924f-kube-api-access-klpg7\") pod \"ovnkube-node-2q2tp\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:02.547997 master-0 kubenswrapper[3972]: I0313 10:35:02.547730 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:04.309411 master-0 kubenswrapper[3972]: I0313 10:35:04.309327 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:04.310244 master-0 kubenswrapper[3972]: E0313 10:35:04.309542 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:04.995117 master-0 kubenswrapper[3972]: W0313 10:35:04.995015 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfb154e7_a689_4694_a500_cb76a91d924f.slice/crio-e90665783ccbc42369d3a5509b74862f544344287640369b3e630abf2508a1ac WatchSource:0}: Error finding container e90665783ccbc42369d3a5509b74862f544344287640369b3e630abf2508a1ac: Status 404 returned error can't find the container with id e90665783ccbc42369d3a5509b74862f544344287640369b3e630abf2508a1ac Mar 13 10:35:05.002409 master-0 kubenswrapper[3972]: W0313 10:35:05.002368 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod193b3b95_f9a3_4272_853b_86366ce348a2.slice/crio-7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0 WatchSource:0}: Error finding container 7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0: Status 404 returned error can't find the container with id 7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0 Mar 13 10:35:05.285236 master-0 kubenswrapper[3972]: I0313 10:35:05.284821 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-jwfjl"] Mar 13 10:35:05.285511 master-0 kubenswrapper[3972]: I0313 10:35:05.285282 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:05.285511 master-0 kubenswrapper[3972]: E0313 10:35:05.285343 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:05.471207 master-0 kubenswrapper[3972]: I0313 10:35:05.471077 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:05.572056 master-0 kubenswrapper[3972]: I0313 10:35:05.571999 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:05.590317 master-0 kubenswrapper[3972]: E0313 10:35:05.590240 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 13 10:35:05.590317 master-0 kubenswrapper[3972]: E0313 10:35:05.590290 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 13 10:35:05.590317 master-0 kubenswrapper[3972]: E0313 10:35:05.590314 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 10:35:05.590652 master-0 kubenswrapper[3972]: E0313 10:35:05.590386 3972 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:35:06.090366574 +0000 UTC m=+108.708482972 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 13 10:35:05.913386 master-0 kubenswrapper[3972]: I0313 10:35:05.913059 3972 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="c0bf4ee121253f4acc846c62a0fe4a189d6104b07034617c1152a5f95507935c" exitCode=0 Mar 13 10:35:05.913622 master-0 kubenswrapper[3972]: I0313 10:35:05.913123 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"c0bf4ee121253f4acc846c62a0fe4a189d6104b07034617c1152a5f95507935c"} Mar 13 10:35:05.917577 master-0 kubenswrapper[3972]: I0313 10:35:05.917519 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"b824573f6b95b2e21d36b9d4c1faa0ee0ac02b8c48ac4481752faf216bc6b459"} Mar 13 10:35:05.917577 master-0 kubenswrapper[3972]: I0313 10:35:05.917563 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0"} Mar 13 
10:35:05.920302 master-0 kubenswrapper[3972]: I0313 10:35:05.920263 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"e90665783ccbc42369d3a5509b74862f544344287640369b3e630abf2508a1ac"}
Mar 13 10:35:05.922195 master-0 kubenswrapper[3972]: I0313 10:35:05.922172 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjv5r" event={"ID":"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc","Type":"ContainerStarted","Data":"d2a94cae7314d31af0d86ce94f25ddeb94ed3dfcd2a8de1530f6be8d77df9d59"}
Mar 13 10:35:05.948962 master-0 kubenswrapper[3972]: I0313 10:35:05.948274 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bjv5r" podStartSLOduration=1.812438594 podStartE2EDuration="16.948255952s" podCreationTimestamp="2026-03-13 10:34:49 +0000 UTC" firstStartedPulling="2026-03-13 10:34:49.965296406 +0000 UTC m=+92.583412794" lastFinishedPulling="2026-03-13 10:35:05.101113764 +0000 UTC m=+107.719230152" observedRunningTime="2026-03-13 10:35:05.947683058 +0000 UTC m=+108.565799446" watchObservedRunningTime="2026-03-13 10:35:05.948255952 +0000 UTC m=+108.566372340"
Mar 13 10:35:06.194223 master-0 kubenswrapper[3972]: I0313 10:35:06.192611 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:06.194223 master-0 kubenswrapper[3972]: E0313 10:35:06.192928 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 10:35:06.194223 master-0 kubenswrapper[3972]: E0313 10:35:06.192966 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 10:35:06.194223 master-0 kubenswrapper[3972]: E0313 10:35:06.192988 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:06.194223 master-0 kubenswrapper[3972]: E0313 10:35:06.193075 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:35:07.193047421 +0000 UTC m=+109.811163799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:06.309592 master-0 kubenswrapper[3972]: I0313 10:35:06.309511 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:06.309809 master-0 kubenswrapper[3972]: E0313 10:35:06.309653 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:06.409125 master-0 kubenswrapper[3972]: I0313 10:35:06.405449 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:06.409125 master-0 kubenswrapper[3972]: E0313 10:35:06.405686 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 10:35:06.409125 master-0 kubenswrapper[3972]: E0313 10:35:06.405754 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:22.405732853 +0000 UTC m=+125.023849261 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 13 10:35:07.212587 master-0 kubenswrapper[3972]: I0313 10:35:07.212514 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:07.213209 master-0 kubenswrapper[3972]: E0313 10:35:07.212693 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 10:35:07.213209 master-0 kubenswrapper[3972]: E0313 10:35:07.212715 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 10:35:07.213209 master-0 kubenswrapper[3972]: E0313 10:35:07.212728 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:07.213209 master-0 kubenswrapper[3972]: E0313 10:35:07.212787 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:35:09.212770494 +0000 UTC m=+111.830886892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:07.526212 master-0 kubenswrapper[3972]: I0313 10:35:07.526040 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:07.526474 master-0 kubenswrapper[3972]: E0313 10:35:07.526258 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:07.832315 master-0 kubenswrapper[3972]: I0313 10:35:07.830668 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-hkjrg"]
Mar 13 10:35:07.832315 master-0 kubenswrapper[3972]: I0313 10:35:07.831136 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:07.835659 master-0 kubenswrapper[3972]: I0313 10:35:07.835607 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 13 10:35:07.835988 master-0 kubenswrapper[3972]: I0313 10:35:07.835946 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 10:35:07.836517 master-0 kubenswrapper[3972]: I0313 10:35:07.836451 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 10:35:07.836836 master-0 kubenswrapper[3972]: I0313 10:35:07.836770 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 10:35:07.837164 master-0 kubenswrapper[3972]: I0313 10:35:07.837118 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 10:35:08.022041 master-0 kubenswrapper[3972]: I0313 10:35:08.021985 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.022346 master-0 kubenswrapper[3972]: I0313 10:35:08.022064 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.022346 master-0 kubenswrapper[3972]: I0313 10:35:08.022118 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.022346 master-0 kubenswrapper[3972]: I0313 10:35:08.022212 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.123255 master-0 kubenswrapper[3972]: I0313 10:35:08.123195 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.123468 master-0 kubenswrapper[3972]: I0313 10:35:08.123352 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.123468 master-0 kubenswrapper[3972]: I0313 10:35:08.123430 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.123468 master-0 kubenswrapper[3972]: I0313 10:35:08.123452 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.124809 master-0 kubenswrapper[3972]: I0313 10:35:08.124763 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.124917 master-0 kubenswrapper[3972]: I0313 10:35:08.124875 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.127826 master-0 kubenswrapper[3972]: I0313 10:35:08.127799 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.309670 master-0 kubenswrapper[3972]: I0313 10:35:08.309592 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:08.310641 master-0 kubenswrapper[3972]: E0313 10:35:08.309709 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:08.623582 master-0 kubenswrapper[3972]: I0313 10:35:08.623547 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.752663 master-0 kubenswrapper[3972]: I0313 10:35:08.752519 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:35:08.764516 master-0 kubenswrapper[3972]: W0313 10:35:08.764446 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda3c91eef_ec46_419f_b418_ac3a8094b77d.slice/crio-45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5 WatchSource:0}: Error finding container 45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5: Status 404 returned error can't find the container with id 45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5
Mar 13 10:35:08.931246 master-0 kubenswrapper[3972]: I0313 10:35:08.931125 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5"}
Mar 13 10:35:09.232742 master-0 kubenswrapper[3972]: I0313 10:35:09.232641 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:09.232938 master-0 kubenswrapper[3972]: E0313 10:35:09.232789 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 10:35:09.232938 master-0 kubenswrapper[3972]: E0313 10:35:09.232809 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 10:35:09.232938 master-0 kubenswrapper[3972]: E0313 10:35:09.232822 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:09.232938 master-0 kubenswrapper[3972]: E0313 10:35:09.232871 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:35:13.232855242 +0000 UTC m=+115.850971630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:09.309141 master-0 kubenswrapper[3972]: I0313 10:35:09.309067 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:09.309384 master-0 kubenswrapper[3972]: E0313 10:35:09.309230 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:10.308795 master-0 kubenswrapper[3972]: I0313 10:35:10.308740 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:10.309301 master-0 kubenswrapper[3972]: E0313 10:35:10.308900 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:11.308994 master-0 kubenswrapper[3972]: I0313 10:35:11.308744 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:11.308994 master-0 kubenswrapper[3972]: E0313 10:35:11.308932 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:12.309319 master-0 kubenswrapper[3972]: I0313 10:35:12.309263 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:12.312359 master-0 kubenswrapper[3972]: E0313 10:35:12.309431 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:12.323323 master-0 kubenswrapper[3972]: I0313 10:35:12.323252 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"]
Mar 13 10:35:12.946599 master-0 kubenswrapper[3972]: I0313 10:35:12.946548 3972 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="dc84ce423f666bcd523a540ff225040b69d4425d2faf8d523c79672591bd3375" exitCode=0
Mar 13 10:35:12.946836 master-0 kubenswrapper[3972]: I0313 10:35:12.946647 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"dc84ce423f666bcd523a540ff225040b69d4425d2faf8d523c79672591bd3375"}
Mar 13 10:35:12.968166 master-0 kubenswrapper[3972]: I0313 10:35:12.967790 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=0.967769025 podStartE2EDuration="967.769025ms" podCreationTimestamp="2026-03-13 10:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:35:12.967652503 +0000 UTC m=+115.585768901" watchObservedRunningTime="2026-03-13 10:35:12.967769025 +0000 UTC m=+115.585885413"
Mar 13 10:35:13.273563 master-0 kubenswrapper[3972]: I0313 10:35:13.273374 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:13.273563 master-0 kubenswrapper[3972]: E0313 10:35:13.273542 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 10:35:13.273933 master-0 kubenswrapper[3972]: E0313 10:35:13.273585 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 10:35:13.273933 master-0 kubenswrapper[3972]: E0313 10:35:13.273598 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:13.273933 master-0 kubenswrapper[3972]: E0313 10:35:13.273648 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:35:21.273633171 +0000 UTC m=+123.891749559 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:13.308630 master-0 kubenswrapper[3972]: I0313 10:35:13.308548 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:13.308933 master-0 kubenswrapper[3972]: E0313 10:35:13.308692 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:14.309753 master-0 kubenswrapper[3972]: I0313 10:35:14.309571 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:14.309753 master-0 kubenswrapper[3972]: E0313 10:35:14.309697 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:14.954844 master-0 kubenswrapper[3972]: I0313 10:35:14.954777 3972 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="ffc23a177a087ad146cddc2bc253947b08886f41c707f8ee47efc6dd4d3c5c8e" exitCode=0
Mar 13 10:35:14.954844 master-0 kubenswrapper[3972]: I0313 10:35:14.954832 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"ffc23a177a087ad146cddc2bc253947b08886f41c707f8ee47efc6dd4d3c5c8e"}
Mar 13 10:35:15.309041 master-0 kubenswrapper[3972]: I0313 10:35:15.308913 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:15.309235 master-0 kubenswrapper[3972]: E0313 10:35:15.309039 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:16.308840 master-0 kubenswrapper[3972]: I0313 10:35:16.308761 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:16.309779 master-0 kubenswrapper[3972]: E0313 10:35:16.309010 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:17.308815 master-0 kubenswrapper[3972]: I0313 10:35:17.308758 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:17.309072 master-0 kubenswrapper[3972]: E0313 10:35:17.308897 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:17.322913 master-0 kubenswrapper[3972]: I0313 10:35:17.322856 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 13 10:35:18.185157 master-0 kubenswrapper[3972]: E0313 10:35:18.185086 3972 kubelet_node_status.go:497] "Node not becoming ready in time after startup"
Mar 13 10:35:18.258463 master-0 kubenswrapper[3972]: E0313 10:35:18.258393 3972 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 10:35:18.308905 master-0 kubenswrapper[3972]: I0313 10:35:18.308853 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:18.310295 master-0 kubenswrapper[3972]: E0313 10:35:18.310240 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:18.322261 master-0 kubenswrapper[3972]: I0313 10:35:18.322181 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=1.322163508 podStartE2EDuration="1.322163508s" podCreationTimestamp="2026-03-13 10:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:35:18.321408189 +0000 UTC m=+120.939524577" watchObservedRunningTime="2026-03-13 10:35:18.322163508 +0000 UTC m=+120.940279896"
Mar 13 10:35:19.308812 master-0 kubenswrapper[3972]: I0313 10:35:19.308757 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:19.309039 master-0 kubenswrapper[3972]: E0313 10:35:19.308901 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:20.309628 master-0 kubenswrapper[3972]: I0313 10:35:20.309561 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:20.310141 master-0 kubenswrapper[3972]: E0313 10:35:20.309799 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:21.309109 master-0 kubenswrapper[3972]: I0313 10:35:21.309044 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:21.309336 master-0 kubenswrapper[3972]: E0313 10:35:21.309210 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:21.350127 master-0 kubenswrapper[3972]: I0313 10:35:21.350032 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:21.351456 master-0 kubenswrapper[3972]: E0313 10:35:21.350204 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 10:35:21.351456 master-0 kubenswrapper[3972]: E0313 10:35:21.350222 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 10:35:21.351456 master-0 kubenswrapper[3972]: E0313 10:35:21.350231 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:21.351456 master-0 kubenswrapper[3972]: E0313 10:35:21.350281 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:35:37.350266019 +0000 UTC m=+139.968382407 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:21.973760 master-0 kubenswrapper[3972]: I0313 10:35:21.973701 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac"}
Mar 13 10:35:21.975345 master-0 kubenswrapper[3972]: I0313 10:35:21.975295 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51" exitCode=0
Mar 13 10:35:21.975445 master-0 kubenswrapper[3972]: I0313 10:35:21.975360 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"}
Mar 13 10:35:21.991610 master-0 kubenswrapper[3972]: I0313 10:35:21.991515 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" podStartSLOduration=4.7264609140000005 podStartE2EDuration="20.991494844s" podCreationTimestamp="2026-03-13 10:35:01 +0000 UTC" firstStartedPulling="2026-03-13 10:35:05.290265391 +0000 UTC m=+107.908381779" lastFinishedPulling="2026-03-13 10:35:21.555299321 +0000 UTC m=+124.173415709" observedRunningTime="2026-03-13 10:35:21.990754165 +0000 UTC m=+124.608870553" watchObservedRunningTime="2026-03-13 10:35:21.991494844 +0000 UTC m=+124.609611242"
Mar 13 10:35:22.309996 master-0 kubenswrapper[3972]: I0313 10:35:22.309683 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:22.310195 master-0 kubenswrapper[3972]: E0313 10:35:22.310076 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:22.459112 master-0 kubenswrapper[3972]: I0313 10:35:22.458991 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:22.460185 master-0 kubenswrapper[3972]: E0313 10:35:22.459316 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:35:22.460185 master-0 kubenswrapper[3972]: E0313 10:35:22.459405 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.459380123 +0000 UTC m=+157.077496551 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 13 10:35:22.983522 master-0 kubenswrapper[3972]: I0313 10:35:22.983459 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"} Mar 13 10:35:22.983522 master-0 kubenswrapper[3972]: I0313 10:35:22.983512 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"} Mar 13 10:35:22.983522 master-0 kubenswrapper[3972]: I0313 10:35:22.983524 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"} Mar 13 10:35:22.983522 master-0 kubenswrapper[3972]: I0313 10:35:22.983535 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"} Mar 13 10:35:22.983522 master-0 kubenswrapper[3972]: I0313 10:35:22.983543 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"} Mar 13 10:35:22.983522 master-0 
kubenswrapper[3972]: I0313 10:35:22.983551 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"} Mar 13 10:35:22.985687 master-0 kubenswrapper[3972]: I0313 10:35:22.985594 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248"} Mar 13 10:35:22.985687 master-0 kubenswrapper[3972]: I0313 10:35:22.985663 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"8ab405ef0e7b542476b55860f034ef7404421d6b9bac08317c0aa8791073c002"} Mar 13 10:35:23.002978 master-0 kubenswrapper[3972]: I0313 10:35:23.002467 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-hkjrg" podStartSLOduration=2.817235995 podStartE2EDuration="16.00243539s" podCreationTimestamp="2026-03-13 10:35:07 +0000 UTC" firstStartedPulling="2026-03-13 10:35:08.766666545 +0000 UTC m=+111.384782933" lastFinishedPulling="2026-03-13 10:35:21.95186592 +0000 UTC m=+124.569982328" observedRunningTime="2026-03-13 10:35:23.001565938 +0000 UTC m=+125.619682326" watchObservedRunningTime="2026-03-13 10:35:23.00243539 +0000 UTC m=+125.620551808" Mar 13 10:35:23.260211 master-0 kubenswrapper[3972]: E0313 10:35:23.260023 3972 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 10:35:23.308788 master-0 kubenswrapper[3972]: I0313 10:35:23.308712 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:23.309939 master-0 kubenswrapper[3972]: E0313 10:35:23.308878 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:24.309226 master-0 kubenswrapper[3972]: I0313 10:35:24.308921 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:24.309226 master-0 kubenswrapper[3972]: E0313 10:35:24.309077 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:24.998266 master-0 kubenswrapper[3972]: I0313 10:35:24.998076 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"} Mar 13 10:35:25.309425 master-0 kubenswrapper[3972]: I0313 10:35:25.309307 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:25.310050 master-0 kubenswrapper[3972]: E0313 10:35:25.309450 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:26.309453 master-0 kubenswrapper[3972]: I0313 10:35:26.309402 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:26.310238 master-0 kubenswrapper[3972]: E0313 10:35:26.309614 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:27.309034 master-0 kubenswrapper[3972]: I0313 10:35:27.308970 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:27.309432 master-0 kubenswrapper[3972]: E0313 10:35:27.309146 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:28.263778 master-0 kubenswrapper[3972]: E0313 10:35:28.263367 3972 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 13 10:35:28.320600 master-0 kubenswrapper[3972]: I0313 10:35:28.320527 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:28.320806 master-0 kubenswrapper[3972]: E0313 10:35:28.320755 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:29.308585 master-0 kubenswrapper[3972]: I0313 10:35:29.308510 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:29.309328 master-0 kubenswrapper[3972]: E0313 10:35:29.308650 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:30.019469 master-0 kubenswrapper[3972]: I0313 10:35:30.019347 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerStarted","Data":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"} Mar 13 10:35:30.019884 master-0 kubenswrapper[3972]: I0313 10:35:30.019830 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:30.051153 master-0 kubenswrapper[3972]: I0313 10:35:30.051001 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:30.309758 master-0 kubenswrapper[3972]: I0313 10:35:30.309568 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:30.309758 master-0 kubenswrapper[3972]: E0313 10:35:30.309727 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:31.022179 master-0 kubenswrapper[3972]: I0313 10:35:31.022100 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:31.022179 master-0 kubenswrapper[3972]: I0313 10:35:31.022155 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:31.051075 master-0 kubenswrapper[3972]: I0313 10:35:31.051003 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:31.315803 master-0 kubenswrapper[3972]: I0313 10:35:31.315603 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:31.315803 master-0 kubenswrapper[3972]: E0313 10:35:31.315790 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:31.391974 master-0 kubenswrapper[3972]: I0313 10:35:31.391920 3972 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2q2tp"] Mar 13 10:35:31.432548 master-0 kubenswrapper[3972]: I0313 10:35:31.432446 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podStartSLOduration=13.047809855 podStartE2EDuration="29.432417145s" podCreationTimestamp="2026-03-13 10:35:02 +0000 UTC" firstStartedPulling="2026-03-13 10:35:04.998417123 +0000 UTC m=+107.616533521" lastFinishedPulling="2026-03-13 10:35:21.383024433 +0000 UTC m=+124.001140811" observedRunningTime="2026-03-13 10:35:31.43139903 +0000 UTC m=+134.049515478" watchObservedRunningTime="2026-03-13 10:35:31.432417145 +0000 UTC m=+134.050533543" Mar 13 10:35:32.029137 master-0 kubenswrapper[3972]: I0313 10:35:32.028765 3972 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="c08b2c581358381ac2f0c793ddf6295e272c0061c1b2d6e05d6e5ab7c2a5729b" exitCode=0 Mar 13 10:35:32.029389 master-0 kubenswrapper[3972]: I0313 10:35:32.029002 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"c08b2c581358381ac2f0c793ddf6295e272c0061c1b2d6e05d6e5ab7c2a5729b"} Mar 13 10:35:32.309067 master-0 kubenswrapper[3972]: I0313 10:35:32.308979 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:32.309274 master-0 kubenswrapper[3972]: E0313 10:35:32.309201 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:32.944981 master-0 kubenswrapper[3972]: I0313 10:35:32.939021 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jwfjl"] Mar 13 10:35:32.944981 master-0 kubenswrapper[3972]: I0313 10:35:32.939314 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:35:32.944981 master-0 kubenswrapper[3972]: I0313 10:35:32.939443 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c5vhc"] Mar 13 10:35:32.944981 master-0 kubenswrapper[3972]: E0313 10:35:32.939517 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f" Mar 13 10:35:33.039149 master-0 kubenswrapper[3972]: I0313 10:35:33.038839 3972 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="ce2b6ceda0b8c8212b1e35589d611accb6e40391c87b39cfb64f98a22b7e5dda" exitCode=0 Mar 13 10:35:33.039149 master-0 kubenswrapper[3972]: I0313 10:35:33.038935 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:33.039149 master-0 kubenswrapper[3972]: I0313 10:35:33.038924 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"ce2b6ceda0b8c8212b1e35589d611accb6e40391c87b39cfb64f98a22b7e5dda"} Mar 13 10:35:33.039149 master-0 kubenswrapper[3972]: E0313 10:35:33.039126 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728" Mar 13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.039967 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="nbdb" containerID="cri-o://2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd" gracePeriod=30 Mar 13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.040012 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="northd" containerID="cri-o://aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2" gracePeriod=30 Mar 13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.040021 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-node" containerID="cri-o://86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" gracePeriod=30 Mar 13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.039968 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" gracePeriod=30 Mar 13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.040137 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-acl-logging" containerID="cri-o://2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925" gracePeriod=30 Mar 
13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.039967 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-controller" containerID="cri-o://7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2" gracePeriod=30 Mar 13 10:35:33.040367 master-0 kubenswrapper[3972]: I0313 10:35:33.040218 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="sbdb" containerID="cri-o://2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec" gracePeriod=30 Mar 13 10:35:33.078894 master-0 kubenswrapper[3972]: I0313 10:35:33.078825 3972 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovnkube-controller" containerID="cri-o://437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45" gracePeriod=30 Mar 13 10:35:33.264679 master-0 kubenswrapper[3972]: E0313 10:35:33.264637 3972 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 13 10:35:33.312550 master-0 kubenswrapper[3972]: I0313 10:35:33.312475 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/ovnkube-controller/0.log" Mar 13 10:35:33.315026 master-0 kubenswrapper[3972]: I0313 10:35:33.315013 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/kube-rbac-proxy-ovn-metrics/0.log" Mar 13 10:35:33.315669 master-0 kubenswrapper[3972]: I0313 10:35:33.315641 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/kube-rbac-proxy-node/0.log" Mar 13 10:35:33.316286 master-0 kubenswrapper[3972]: I0313 10:35:33.316236 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/ovn-acl-logging/0.log" Mar 13 10:35:33.316782 master-0 kubenswrapper[3972]: I0313 10:35:33.316722 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/ovn-controller/0.log" Mar 13 10:35:33.317356 master-0 kubenswrapper[3972]: I0313 10:35:33.317147 3972 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390365 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-vww4t"] Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390519 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-controller" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390556 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-controller" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390568 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="sbdb" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390577 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="sbdb" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390587 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-node" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390596 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-node" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390605 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="northd" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390613 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="northd" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390622 3972 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-acl-logging" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390632 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-acl-logging" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390641 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="nbdb" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390649 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="nbdb" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390658 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390666 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390675 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kubecfg-setup" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390683 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kubecfg-setup" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: E0313 10:35:33.390691 3972 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovnkube-controller" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390699 3972 state_mem.go:107] "Deleted CPUSet assignment" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" 
containerName="ovnkube-controller" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390761 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="northd" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390778 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="nbdb" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390787 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-controller" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390795 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovn-acl-logging" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390804 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-ovn-metrics" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390812 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="ovnkube-controller" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390820 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="kube-rbac-proxy-node" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.390829 3972 memory_manager.go:354] "RemoveStaleState removing state" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" containerName="sbdb" Mar 13 10:35:33.395259 master-0 kubenswrapper[3972]: I0313 10:35:33.391735 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.484935 master-0 kubenswrapper[3972]: I0313 10:35:33.484862 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-netns\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485148 master-0 kubenswrapper[3972]: I0313 10:35:33.484954 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfb154e7-a689-4694-a500-cb76a91d924f-ovn-node-metrics-cert\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485148 master-0 kubenswrapper[3972]: I0313 10:35:33.484983 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.485148 master-0 kubenswrapper[3972]: I0313 10:35:33.485049 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-bin\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485148 master-0 kubenswrapper[3972]: I0313 10:35:33.485127 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). 
InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.485302 master-0 kubenswrapper[3972]: I0313 10:35:33.485192 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-var-lib-openvswitch\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485302 master-0 kubenswrapper[3972]: I0313 10:35:33.485221 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.485350 master-0 kubenswrapper[3972]: I0313 10:35:33.485334 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-config\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485558 master-0 kubenswrapper[3972]: I0313 10:35:33.485496 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-log-socket\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485558 master-0 kubenswrapper[3972]: I0313 10:35:33.485532 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-log-socket" (OuterVolumeSpecName: "log-socket") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: 
"bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.485648 master-0 kubenswrapper[3972]: I0313 10:35:33.485619 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-ovn\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485676 master-0 kubenswrapper[3972]: I0313 10:35:33.485648 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-systemd-units\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485676 master-0 kubenswrapper[3972]: I0313 10:35:33.485665 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-openvswitch\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485722 master-0 kubenswrapper[3972]: I0313 10:35:33.485687 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-klpg7\" (UniqueName: \"kubernetes.io/projected/bfb154e7-a689-4694-a500-cb76a91d924f-kube-api-access-klpg7\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485722 master-0 kubenswrapper[3972]: I0313 10:35:33.485702 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-node-log\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485722 master-0 
kubenswrapper[3972]: I0313 10:35:33.485718 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-env-overrides\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485803 master-0 kubenswrapper[3972]: I0313 10:35:33.485735 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-ovn-kubernetes\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485803 master-0 kubenswrapper[3972]: I0313 10:35:33.485750 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-netd\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485803 master-0 kubenswrapper[3972]: I0313 10:35:33.485764 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-slash\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485803 master-0 kubenswrapper[3972]: I0313 10:35:33.485782 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-script-lib\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485803 master-0 kubenswrapper[3972]: I0313 10:35:33.485797 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-systemd\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485926 master-0 kubenswrapper[3972]: I0313 10:35:33.485812 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485926 master-0 kubenswrapper[3972]: I0313 10:35:33.485829 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-kubelet\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485926 master-0 kubenswrapper[3972]: I0313 10:35:33.485844 3972 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-etc-openvswitch\") pod \"bfb154e7-a689-4694-a500-cb76a91d924f\" (UID: \"bfb154e7-a689-4694-a500-cb76a91d924f\") " Mar 13 10:35:33.485926 master-0 kubenswrapper[3972]: I0313 10:35:33.485920 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486056 master-0 kubenswrapper[3972]: I0313 10:35:33.485972 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486056 master-0 kubenswrapper[3972]: I0313 10:35:33.485989 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486056 master-0 kubenswrapper[3972]: I0313 10:35:33.486030 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486155 master-0 kubenswrapper[3972]: I0313 10:35:33.486060 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486155 master-0 kubenswrapper[3972]: I0313 10:35:33.486081 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486155 master-0 kubenswrapper[3972]: I0313 10:35:33.486114 3972 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486245 master-0 kubenswrapper[3972]: I0313 10:35:33.486170 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486245 master-0 kubenswrapper[3972]: I0313 10:35:33.486201 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486245 master-0 kubenswrapper[3972]: I0313 10:35:33.486226 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486320 master-0 kubenswrapper[3972]: I0313 10:35:33.486246 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486320 master-0 kubenswrapper[3972]: I0313 10:35:33.486268 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486320 master-0 kubenswrapper[3972]: I0313 10:35:33.486287 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486320 master-0 kubenswrapper[3972]: I0313 10:35:33.486307 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486412 master-0 kubenswrapper[3972]: I0313 10:35:33.486327 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486412 master-0 kubenswrapper[3972]: I0313 10:35:33.486352 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod 
\"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486412 master-0 kubenswrapper[3972]: I0313 10:35:33.486372 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486412 master-0 kubenswrapper[3972]: I0313 10:35:33.486394 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486412 master-0 kubenswrapper[3972]: I0313 10:35:33.486411 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485687 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485741 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485810 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485814 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485873 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485905 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-node-log" (OuterVolumeSpecName: "node-log") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.485968 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.486377 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.486402 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-slash" (OuterVolumeSpecName: "host-slash") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.486441 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.486575 master-0 kubenswrapper[3972]: I0313 10:35:33.486543 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486870 master-0 kubenswrapper[3972]: I0313 10:35:33.486588 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486870 master-0 kubenswrapper[3972]: I0313 10:35:33.486618 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "etc-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.486870 master-0 kubenswrapper[3972]: I0313 10:35:33.486690 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:35:33.486870 master-0 kubenswrapper[3972]: I0313 10:35:33.486825 3972 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-netns\") on node \"master-0\" DevicePath \"\"" Mar 13 10:35:33.486870 master-0 kubenswrapper[3972]: I0313 10:35:33.486848 3972 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-bin\") on node \"master-0\" DevicePath \"\"" Mar 13 10:35:33.487054 master-0 kubenswrapper[3972]: I0313 10:35:33.486967 3972 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 13 10:35:33.487054 master-0 kubenswrapper[3972]: I0313 10:35:33.487008 3972 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-log-socket\") on node \"master-0\" DevicePath \"\"" Mar 13 10:35:33.494170 master-0 kubenswrapper[3972]: I0313 10:35:33.494040 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bfb154e7-a689-4694-a500-cb76a91d924f-kube-api-access-klpg7" (OuterVolumeSpecName: "kube-api-access-klpg7") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: 
"bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "kube-api-access-klpg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:35:33.494565 master-0 kubenswrapper[3972]: I0313 10:35:33.494443 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:35:33.495606 master-0 kubenswrapper[3972]: I0313 10:35:33.495560 3972 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bfb154e7-a689-4694-a500-cb76a91d924f-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "bfb154e7-a689-4694-a500-cb76a91d924f" (UID: "bfb154e7-a689-4694-a500-cb76a91d924f"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:35:33.588085 master-0 kubenswrapper[3972]: I0313 10:35:33.587879 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588085 master-0 kubenswrapper[3972]: I0313 10:35:33.587964 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588082 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588180 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588197 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 
10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588224 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588238 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588260 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588571 master-0 kubenswrapper[3972]: I0313 10:35:33.588431 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588577 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: 
\"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588664 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588739 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588739 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588761 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588789 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588814 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588846 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588828 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588890 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588923 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588926 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588958 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.588984 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.588989 master-0 kubenswrapper[3972]: I0313 10:35:33.589005 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589030 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589053 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589078 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589113 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589124 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589200 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589304 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589367 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589400 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589422 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589473 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589492 3972 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-etc-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589523 3972 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589532 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589551 3972 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589573 3972 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/bfb154e7-a689-4694-a500-cb76a91d924f-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589591 3972 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589610 3972 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589627 3972 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.589974 master-0 kubenswrapper[3972]: I0313 10:35:33.589644 3972 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589661 3972 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-klpg7\" (UniqueName: \"kubernetes.io/projected/bfb154e7-a689-4694-a500-cb76a91d924f-kube-api-access-klpg7\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589678 3972 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-node-log\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589694 3972 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-env-overrides\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589713 3972 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589730 3972 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-slash\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589746 3972 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589764 3972 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/bfb154e7-a689-4694-a500-cb76a91d924f-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.589781 3972 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/bfb154e7-a689-4694-a500-cb76a91d924f-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.590034 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.591651 master-0 kubenswrapper[3972]: I0313 10:35:33.590374 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.593044 master-0 kubenswrapper[3972]: I0313 10:35:33.591977 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.616709 master-0 kubenswrapper[3972]: I0313 10:35:33.616595 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.711184 master-0 kubenswrapper[3972]: I0313 10:35:33.711049 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:33.732453 master-0 kubenswrapper[3972]: W0313 10:35:33.732374 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb060653_0d4b_4759_a7a1_c5dce194cce7.slice/crio-0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1 WatchSource:0}: Error finding container 0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1: Status 404 returned error can't find the container with id 0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1
Mar 13 10:35:34.052281 master-0 kubenswrapper[3972]: I0313 10:35:34.051892 3972 generic.go:334] "Generic (PLEG): container finished" podID="fb060653-0d4b-4759-a7a1-c5dce194cce7" containerID="f741ec84eccfaea3008e82066654cae2f174abb120ece50ffb0345c3a6b62422" exitCode=0
Mar 13 10:35:34.052281 master-0 kubenswrapper[3972]: I0313 10:35:34.051971 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerDied","Data":"f741ec84eccfaea3008e82066654cae2f174abb120ece50ffb0345c3a6b62422"}
Mar 13 10:35:34.052281 master-0 kubenswrapper[3972]: I0313 10:35:34.052301 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1"}
Mar 13 10:35:34.056844 master-0 kubenswrapper[3972]: I0313 10:35:34.056764 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/ovnkube-controller/0.log"
Mar 13 10:35:34.065757 master-0 kubenswrapper[3972]: I0313 10:35:34.065709 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/kube-rbac-proxy-ovn-metrics/0.log"
Mar 13 10:35:34.066420 master-0 kubenswrapper[3972]: I0313 10:35:34.066349 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/kube-rbac-proxy-node/0.log"
Mar 13 10:35:34.067089 master-0 kubenswrapper[3972]: I0313 10:35:34.067053 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/ovn-acl-logging/0.log"
Mar 13 10:35:34.067564 master-0 kubenswrapper[3972]: I0313 10:35:34.067522 3972 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2q2tp_bfb154e7-a689-4694-a500-cb76a91d924f/ovn-controller/0.log"
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067941 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45" exitCode=2
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067963 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec" exitCode=0
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067971 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd" exitCode=0
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067979 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2" exitCode=0
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067985 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" exitCode=143
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067992 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" exitCode=143
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.067999 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925" exitCode=143
Mar 13 10:35:34.067973 master-0 kubenswrapper[3972]: I0313 10:35:34.068005 3972 generic.go:334] "Generic (PLEG): container finished" podID="bfb154e7-a689-4694-a500-cb76a91d924f" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2" exitCode=143
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068057 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068072 3972 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp"
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068084 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068114 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068124 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068133 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068143 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068154 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068300 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068306 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068314 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068323 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068329 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068334 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068339 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068344 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068349 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068353 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068358 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068363 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068371 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068379 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068386 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068391 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068396 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"}
Mar 13 10:35:34.068577 master-0 kubenswrapper[3972]: I0313 10:35:34.068401 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068406 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068410 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068415 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068424 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068431 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2q2tp" event={"ID":"bfb154e7-a689-4694-a500-cb76a91d924f","Type":"ContainerDied","Data":"e90665783ccbc42369d3a5509b74862f544344287640369b3e630abf2508a1ac"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068440 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068446 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068452 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068459 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068465 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068470 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068474 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068479 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068483 3972 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"}
Mar 13 10:35:34.070542 master-0 kubenswrapper[3972]: I0313 10:35:34.068522 3972 scope.go:117] "RemoveContainer" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"
Mar 13 10:35:34.091223 master-0 kubenswrapper[3972]: I0313 10:35:34.090721 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerStarted","Data":"7cfa3dc4e8621eea443d54a6af5854af2a55bace9bce4224ea7b93f4c1da9807"}
Mar 13 10:35:34.096133 master-0 kubenswrapper[3972]: I0313 10:35:34.094658 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:35:34.096133 master-0 kubenswrapper[3972]: E0313 10:35:34.094749 3972 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 13 10:35:34.096133 master-0 kubenswrapper[3972]: E0313 10:35:34.094807 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:38.094788343 +0000 UTC m=+200.712904731 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found
Mar 13 10:35:34.103162 master-0 kubenswrapper[3972]: I0313 10:35:34.103128 3972 scope.go:117] "RemoveContainer" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"
Mar 13 10:35:34.116531 master-0 kubenswrapper[3972]: I0313 10:35:34.116270 3972 scope.go:117] "RemoveContainer" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"
Mar 13 10:35:34.134382 master-0 kubenswrapper[3972]: I0313 10:35:34.133562 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-72t2n" podStartSLOduration=3.917384807 podStartE2EDuration="45.133533363s" podCreationTimestamp="2026-03-13 10:34:49 +0000 UTC" firstStartedPulling="2026-03-13 10:34:50.239402203 +0000 UTC m=+92.857518631" lastFinishedPulling="2026-03-13 10:35:31.455550799 +0000 UTC m=+134.073667187" observedRunningTime="2026-03-13 10:35:34.116294694 +0000 UTC m=+136.734411152" watchObservedRunningTime="2026-03-13 10:35:34.133533363 +0000 UTC m=+136.751649761"
Mar 13 10:35:34.134382 master-0 kubenswrapper[3972]: I0313 10:35:34.134316 3972 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2q2tp"]
Mar 13 10:35:34.147857 master-0 kubenswrapper[3972]: I0313 10:35:34.147628 3972 scope.go:117] "RemoveContainer" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"
Mar 13 10:35:34.149692 master-0 kubenswrapper[3972]: I0313 10:35:34.149625 3972 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-2q2tp"]
Mar 13 10:35:34.159064 master-0 kubenswrapper[3972]: I0313 10:35:34.159019 3972 scope.go:117] "RemoveContainer" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"
Mar 13 10:35:34.170137 master-0 kubenswrapper[3972]: I0313 10:35:34.170054 3972 scope.go:117] "RemoveContainer" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"
Mar 13 10:35:34.212496 master-0 kubenswrapper[3972]: I0313 10:35:34.212253 3972 scope.go:117] "RemoveContainer" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"
Mar 13 10:35:34.220720 master-0 kubenswrapper[3972]: I0313 10:35:34.220625 3972 scope.go:117] "RemoveContainer" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"
Mar 13 10:35:34.236539 master-0 kubenswrapper[3972]: I0313 10:35:34.236503 3972 scope.go:117] "RemoveContainer" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"
Mar 13 10:35:34.247714 master-0 kubenswrapper[3972]: I0313 10:35:34.247676 3972 scope.go:117] "RemoveContainer" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"
Mar 13 10:35:34.248176 master-0 kubenswrapper[3972]: E0313 10:35:34.248150 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": container with ID starting with 437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45 not found: ID does not exist" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"
Mar 13 10:35:34.248233 master-0 kubenswrapper[3972]: I0313 10:35:34.248195 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"} err="failed to get container status \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": rpc error: code = NotFound desc = could not find container \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": container with ID starting with 437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45 not found: ID does not exist"
Mar 13 10:35:34.248233 master-0 kubenswrapper[3972]: I0313 10:35:34.248219 3972 scope.go:117] "RemoveContainer" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"
Mar 13 10:35:34.248750 master-0 kubenswrapper[3972]: E0313 10:35:34.248709 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": container with ID starting with 2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec not found: ID does not exist" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"
Mar 13 10:35:34.248810 master-0 kubenswrapper[3972]: I0313 10:35:34.248756 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"} err="failed to get container status \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": rpc error: code = NotFound desc = could not find container \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": container with ID starting with 2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec not found: ID does not exist"
Mar 13 10:35:34.248810 master-0 kubenswrapper[3972]: I0313 10:35:34.248790 3972 scope.go:117] "RemoveContainer" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"
Mar 13 10:35:34.249176 master-0 kubenswrapper[3972]: E0313 10:35:34.249138 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": container with ID starting with 2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd not found: ID does not exist" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"
Mar 13 10:35:34.249236 master-0 kubenswrapper[3972]: I0313 10:35:34.249176 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"} err="failed to get container status \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": rpc error: code = NotFound desc = could not find container \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": container with ID starting with 2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd not found: ID does not exist"
Mar 13 10:35:34.249236 master-0 kubenswrapper[3972]: I0313 10:35:34.249221 3972 scope.go:117] "RemoveContainer" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"
Mar 13 10:35:34.249537 master-0 kubenswrapper[3972]: E0313 10:35:34.249517 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": container with ID starting with aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2 not found: ID does not exist"
containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2" Mar 13 10:35:34.249589 master-0 kubenswrapper[3972]: I0313 10:35:34.249542 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"} err="failed to get container status \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": rpc error: code = NotFound desc = could not find container \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": container with ID starting with aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2 not found: ID does not exist" Mar 13 10:35:34.249589 master-0 kubenswrapper[3972]: I0313 10:35:34.249556 3972 scope.go:117] "RemoveContainer" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" Mar 13 10:35:34.249831 master-0 kubenswrapper[3972]: E0313 10:35:34.249809 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": container with ID starting with e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1 not found: ID does not exist" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" Mar 13 10:35:34.249943 master-0 kubenswrapper[3972]: I0313 10:35:34.249834 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"} err="failed to get container status \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": rpc error: code = NotFound desc = could not find container \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": container with ID starting with e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1 not found: ID does not exist" Mar 13 10:35:34.249976 master-0 
kubenswrapper[3972]: I0313 10:35:34.249946 3972 scope.go:117] "RemoveContainer" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" Mar 13 10:35:34.250184 master-0 kubenswrapper[3972]: E0313 10:35:34.250166 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": container with ID starting with 86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36 not found: ID does not exist" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" Mar 13 10:35:34.250236 master-0 kubenswrapper[3972]: I0313 10:35:34.250186 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"} err="failed to get container status \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": rpc error: code = NotFound desc = could not find container \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": container with ID starting with 86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36 not found: ID does not exist" Mar 13 10:35:34.250236 master-0 kubenswrapper[3972]: I0313 10:35:34.250199 3972 scope.go:117] "RemoveContainer" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925" Mar 13 10:35:34.250512 master-0 kubenswrapper[3972]: E0313 10:35:34.250493 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": container with ID starting with 2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925 not found: ID does not exist" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925" Mar 13 10:35:34.250564 master-0 kubenswrapper[3972]: I0313 10:35:34.250512 3972 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"} err="failed to get container status \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": rpc error: code = NotFound desc = could not find container \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": container with ID starting with 2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925 not found: ID does not exist" Mar 13 10:35:34.250564 master-0 kubenswrapper[3972]: I0313 10:35:34.250527 3972 scope.go:117] "RemoveContainer" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2" Mar 13 10:35:34.251006 master-0 kubenswrapper[3972]: E0313 10:35:34.250988 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": container with ID starting with 7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2 not found: ID does not exist" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2" Mar 13 10:35:34.251059 master-0 kubenswrapper[3972]: I0313 10:35:34.251008 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"} err="failed to get container status \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": rpc error: code = NotFound desc = could not find container \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": container with ID starting with 7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2 not found: ID does not exist" Mar 13 10:35:34.251059 master-0 kubenswrapper[3972]: I0313 10:35:34.251021 3972 scope.go:117] "RemoveContainer" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51" Mar 13 
10:35:34.251316 master-0 kubenswrapper[3972]: E0313 10:35:34.251284 3972 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": container with ID starting with 9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51 not found: ID does not exist" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51" Mar 13 10:35:34.251374 master-0 kubenswrapper[3972]: I0313 10:35:34.251315 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"} err="failed to get container status \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": rpc error: code = NotFound desc = could not find container \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": container with ID starting with 9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51 not found: ID does not exist" Mar 13 10:35:34.251374 master-0 kubenswrapper[3972]: I0313 10:35:34.251329 3972 scope.go:117] "RemoveContainer" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45" Mar 13 10:35:34.251540 master-0 kubenswrapper[3972]: I0313 10:35:34.251520 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"} err="failed to get container status \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": rpc error: code = NotFound desc = could not find container \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": container with ID starting with 437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45 not found: ID does not exist" Mar 13 10:35:34.251599 master-0 kubenswrapper[3972]: I0313 10:35:34.251541 3972 scope.go:117] "RemoveContainer" 
containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec" Mar 13 10:35:34.251889 master-0 kubenswrapper[3972]: I0313 10:35:34.251806 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"} err="failed to get container status \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": rpc error: code = NotFound desc = could not find container \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": container with ID starting with 2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec not found: ID does not exist" Mar 13 10:35:34.251889 master-0 kubenswrapper[3972]: I0313 10:35:34.251825 3972 scope.go:117] "RemoveContainer" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd" Mar 13 10:35:34.252354 master-0 kubenswrapper[3972]: I0313 10:35:34.252128 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"} err="failed to get container status \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": rpc error: code = NotFound desc = could not find container \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": container with ID starting with 2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd not found: ID does not exist" Mar 13 10:35:34.252354 master-0 kubenswrapper[3972]: I0313 10:35:34.252148 3972 scope.go:117] "RemoveContainer" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2" Mar 13 10:35:34.252441 master-0 kubenswrapper[3972]: I0313 10:35:34.252394 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"} err="failed to get container status 
\"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": rpc error: code = NotFound desc = could not find container \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": container with ID starting with aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2 not found: ID does not exist" Mar 13 10:35:34.252441 master-0 kubenswrapper[3972]: I0313 10:35:34.252420 3972 scope.go:117] "RemoveContainer" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" Mar 13 10:35:34.252809 master-0 kubenswrapper[3972]: I0313 10:35:34.252717 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"} err="failed to get container status \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": rpc error: code = NotFound desc = could not find container \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": container with ID starting with e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1 not found: ID does not exist" Mar 13 10:35:34.252809 master-0 kubenswrapper[3972]: I0313 10:35:34.252747 3972 scope.go:117] "RemoveContainer" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" Mar 13 10:35:34.253022 master-0 kubenswrapper[3972]: I0313 10:35:34.252989 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"} err="failed to get container status \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": rpc error: code = NotFound desc = could not find container \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": container with ID starting with 86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36 not found: ID does not exist" Mar 13 10:35:34.253022 master-0 kubenswrapper[3972]: I0313 10:35:34.253012 3972 
scope.go:117] "RemoveContainer" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925" Mar 13 10:35:34.253290 master-0 kubenswrapper[3972]: I0313 10:35:34.253216 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"} err="failed to get container status \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": rpc error: code = NotFound desc = could not find container \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": container with ID starting with 2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925 not found: ID does not exist" Mar 13 10:35:34.253290 master-0 kubenswrapper[3972]: I0313 10:35:34.253235 3972 scope.go:117] "RemoveContainer" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2" Mar 13 10:35:34.253473 master-0 kubenswrapper[3972]: I0313 10:35:34.253448 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"} err="failed to get container status \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": rpc error: code = NotFound desc = could not find container \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": container with ID starting with 7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2 not found: ID does not exist" Mar 13 10:35:34.253521 master-0 kubenswrapper[3972]: I0313 10:35:34.253475 3972 scope.go:117] "RemoveContainer" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51" Mar 13 10:35:34.253717 master-0 kubenswrapper[3972]: I0313 10:35:34.253696 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"} err="failed to get container status 
\"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": rpc error: code = NotFound desc = could not find container \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": container with ID starting with 9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51 not found: ID does not exist" Mar 13 10:35:34.253761 master-0 kubenswrapper[3972]: I0313 10:35:34.253715 3972 scope.go:117] "RemoveContainer" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45" Mar 13 10:35:34.253952 master-0 kubenswrapper[3972]: I0313 10:35:34.253931 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"} err="failed to get container status \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": rpc error: code = NotFound desc = could not find container \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": container with ID starting with 437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45 not found: ID does not exist" Mar 13 10:35:34.253952 master-0 kubenswrapper[3972]: I0313 10:35:34.253949 3972 scope.go:117] "RemoveContainer" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec" Mar 13 10:35:34.254234 master-0 kubenswrapper[3972]: I0313 10:35:34.254156 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"} err="failed to get container status \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": rpc error: code = NotFound desc = could not find container \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": container with ID starting with 2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec not found: ID does not exist" Mar 13 10:35:34.254234 master-0 kubenswrapper[3972]: I0313 10:35:34.254176 3972 
scope.go:117] "RemoveContainer" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd" Mar 13 10:35:34.254445 master-0 kubenswrapper[3972]: I0313 10:35:34.254418 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"} err="failed to get container status \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": rpc error: code = NotFound desc = could not find container \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": container with ID starting with 2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd not found: ID does not exist" Mar 13 10:35:34.254445 master-0 kubenswrapper[3972]: I0313 10:35:34.254443 3972 scope.go:117] "RemoveContainer" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2" Mar 13 10:35:34.254668 master-0 kubenswrapper[3972]: I0313 10:35:34.254645 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"} err="failed to get container status \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": rpc error: code = NotFound desc = could not find container \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": container with ID starting with aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2 not found: ID does not exist" Mar 13 10:35:34.254668 master-0 kubenswrapper[3972]: I0313 10:35:34.254664 3972 scope.go:117] "RemoveContainer" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" Mar 13 10:35:34.254970 master-0 kubenswrapper[3972]: I0313 10:35:34.254877 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"} err="failed to get container status 
\"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": rpc error: code = NotFound desc = could not find container \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": container with ID starting with e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1 not found: ID does not exist" Mar 13 10:35:34.254970 master-0 kubenswrapper[3972]: I0313 10:35:34.254901 3972 scope.go:117] "RemoveContainer" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" Mar 13 10:35:34.255209 master-0 kubenswrapper[3972]: I0313 10:35:34.255188 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"} err="failed to get container status \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": rpc error: code = NotFound desc = could not find container \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": container with ID starting with 86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36 not found: ID does not exist" Mar 13 10:35:34.255209 master-0 kubenswrapper[3972]: I0313 10:35:34.255207 3972 scope.go:117] "RemoveContainer" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925" Mar 13 10:35:34.255486 master-0 kubenswrapper[3972]: I0313 10:35:34.255463 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"} err="failed to get container status \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": rpc error: code = NotFound desc = could not find container \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": container with ID starting with 2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925 not found: ID does not exist" Mar 13 10:35:34.255486 master-0 kubenswrapper[3972]: I0313 10:35:34.255481 3972 
scope.go:117] "RemoveContainer" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2" Mar 13 10:35:34.255750 master-0 kubenswrapper[3972]: I0313 10:35:34.255726 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"} err="failed to get container status \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": rpc error: code = NotFound desc = could not find container \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": container with ID starting with 7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2 not found: ID does not exist" Mar 13 10:35:34.255813 master-0 kubenswrapper[3972]: I0313 10:35:34.255751 3972 scope.go:117] "RemoveContainer" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51" Mar 13 10:35:34.256051 master-0 kubenswrapper[3972]: I0313 10:35:34.256030 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"} err="failed to get container status \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": rpc error: code = NotFound desc = could not find container \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": container with ID starting with 9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51 not found: ID does not exist" Mar 13 10:35:34.256051 master-0 kubenswrapper[3972]: I0313 10:35:34.256046 3972 scope.go:117] "RemoveContainer" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45" Mar 13 10:35:34.256332 master-0 kubenswrapper[3972]: I0313 10:35:34.256312 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"} err="failed to get container status 
\"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": rpc error: code = NotFound desc = could not find container \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": container with ID starting with 437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45 not found: ID does not exist" Mar 13 10:35:34.256332 master-0 kubenswrapper[3972]: I0313 10:35:34.256330 3972 scope.go:117] "RemoveContainer" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec" Mar 13 10:35:34.256561 master-0 kubenswrapper[3972]: I0313 10:35:34.256533 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"} err="failed to get container status \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": rpc error: code = NotFound desc = could not find container \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": container with ID starting with 2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec not found: ID does not exist" Mar 13 10:35:34.256561 master-0 kubenswrapper[3972]: I0313 10:35:34.256547 3972 scope.go:117] "RemoveContainer" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd" Mar 13 10:35:34.257066 master-0 kubenswrapper[3972]: I0313 10:35:34.257040 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"} err="failed to get container status \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": rpc error: code = NotFound desc = could not find container \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": container with ID starting with 2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd not found: ID does not exist" Mar 13 10:35:34.257066 master-0 kubenswrapper[3972]: I0313 10:35:34.257065 3972 
scope.go:117] "RemoveContainer" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2" Mar 13 10:35:34.257400 master-0 kubenswrapper[3972]: I0313 10:35:34.257368 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"} err="failed to get container status \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": rpc error: code = NotFound desc = could not find container \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": container with ID starting with aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2 not found: ID does not exist" Mar 13 10:35:34.257476 master-0 kubenswrapper[3972]: I0313 10:35:34.257401 3972 scope.go:117] "RemoveContainer" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1" Mar 13 10:35:34.257766 master-0 kubenswrapper[3972]: I0313 10:35:34.257687 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"} err="failed to get container status \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": rpc error: code = NotFound desc = could not find container \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": container with ID starting with e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1 not found: ID does not exist" Mar 13 10:35:34.257766 master-0 kubenswrapper[3972]: I0313 10:35:34.257708 3972 scope.go:117] "RemoveContainer" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36" Mar 13 10:35:34.258165 master-0 kubenswrapper[3972]: I0313 10:35:34.258130 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"} err="failed to get container status 
\"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": rpc error: code = NotFound desc = could not find container \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": container with ID starting with 86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36 not found: ID does not exist"
Mar 13 10:35:34.258165 master-0 kubenswrapper[3972]: I0313 10:35:34.258159 3972 scope.go:117] "RemoveContainer" containerID="2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"
Mar 13 10:35:34.258511 master-0 kubenswrapper[3972]: I0313 10:35:34.258484 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925"} err="failed to get container status \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": rpc error: code = NotFound desc = could not find container \"2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925\": container with ID starting with 2aa00fac663f1d988096cc896791a6e4bc4a5554778a7784afa91a78df19f925 not found: ID does not exist"
Mar 13 10:35:34.258568 master-0 kubenswrapper[3972]: I0313 10:35:34.258512 3972 scope.go:117] "RemoveContainer" containerID="7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"
Mar 13 10:35:34.258765 master-0 kubenswrapper[3972]: I0313 10:35:34.258747 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2"} err="failed to get container status \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": rpc error: code = NotFound desc = could not find container \"7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2\": container with ID starting with 7d84978e35b8390b137cf6c321fe7488f135a0a212ea874a208dbf15f18294b2 not found: ID does not exist"
Mar 13 10:35:34.258765 master-0 kubenswrapper[3972]: I0313 10:35:34.258764 3972 scope.go:117] "RemoveContainer" containerID="9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"
Mar 13 10:35:34.259140 master-0 kubenswrapper[3972]: I0313 10:35:34.259037 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51"} err="failed to get container status \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": rpc error: code = NotFound desc = could not find container \"9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51\": container with ID starting with 9d3894a3d4cb48bfe4ca027a1ab8b131a79c108077f77579d5995dc2842ada51 not found: ID does not exist"
Mar 13 10:35:34.259140 master-0 kubenswrapper[3972]: I0313 10:35:34.259056 3972 scope.go:117] "RemoveContainer" containerID="437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"
Mar 13 10:35:34.259299 master-0 kubenswrapper[3972]: I0313 10:35:34.259279 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45"} err="failed to get container status \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": rpc error: code = NotFound desc = could not find container \"437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45\": container with ID starting with 437ba345b36302db117155708d9bef88002731eb025951caf3192e9caee41a45 not found: ID does not exist"
Mar 13 10:35:34.259299 master-0 kubenswrapper[3972]: I0313 10:35:34.259297 3972 scope.go:117] "RemoveContainer" containerID="2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"
Mar 13 10:35:34.259502 master-0 kubenswrapper[3972]: I0313 10:35:34.259465 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec"} err="failed to get container status \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": rpc error: code = NotFound desc = could not find container \"2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec\": container with ID starting with 2baff9b5ab29f26a9f2d7d582b6cc681c49145630c0bf5c49d47a543b97d00ec not found: ID does not exist"
Mar 13 10:35:34.259502 master-0 kubenswrapper[3972]: I0313 10:35:34.259490 3972 scope.go:117] "RemoveContainer" containerID="2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"
Mar 13 10:35:34.259772 master-0 kubenswrapper[3972]: I0313 10:35:34.259680 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd"} err="failed to get container status \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": rpc error: code = NotFound desc = could not find container \"2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd\": container with ID starting with 2ec3e1e90efe1aac789b109f41cac3da1c2e0941f15ab8ce65775861bd9baedd not found: ID does not exist"
Mar 13 10:35:34.259772 master-0 kubenswrapper[3972]: I0313 10:35:34.259697 3972 scope.go:117] "RemoveContainer" containerID="aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"
Mar 13 10:35:34.259956 master-0 kubenswrapper[3972]: I0313 10:35:34.259923 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2"} err="failed to get container status \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": rpc error: code = NotFound desc = could not find container \"aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2\": container with ID starting with aa5a40cf8accb31b0594809917d3ca95d6995e3545570203023eef696733f5f2 not found: ID does not exist"
Mar 13 10:35:34.260020 master-0 kubenswrapper[3972]: I0313 10:35:34.259960 3972 scope.go:117] "RemoveContainer" containerID="e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"
Mar 13 10:35:34.260288 master-0 kubenswrapper[3972]: I0313 10:35:34.260261 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1"} err="failed to get container status \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": rpc error: code = NotFound desc = could not find container \"e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1\": container with ID starting with e97d660fa2f22996b31637092b13e3ac0790b1769a173c056ef010c048c28ee1 not found: ID does not exist"
Mar 13 10:35:34.260352 master-0 kubenswrapper[3972]: I0313 10:35:34.260288 3972 scope.go:117] "RemoveContainer" containerID="86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"
Mar 13 10:35:34.260583 master-0 kubenswrapper[3972]: I0313 10:35:34.260561 3972 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36"} err="failed to get container status \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": rpc error: code = NotFound desc = could not find container \"86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36\": container with ID starting with 86eee422a4f74e9e72d0d5cc74bb662653c11a88d11d87293d7bb9114b713e36 not found: ID does not exist"
Mar 13 10:35:34.309736 master-0 kubenswrapper[3972]: I0313 10:35:34.309572 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:34.309969 master-0 kubenswrapper[3972]: E0313 10:35:34.309855 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:34.317373 master-0 kubenswrapper[3972]: I0313 10:35:34.315757 3972 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bfb154e7-a689-4694-a500-cb76a91d924f" path="/var/lib/kubelet/pods/bfb154e7-a689-4694-a500-cb76a91d924f/volumes"
Mar 13 10:35:35.102959 master-0 kubenswrapper[3972]: I0313 10:35:35.102872 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"63231614ec22d4454ef35c7c5f658a2bf8feeb9e3992ebcd24e450ba7c030a73"}
Mar 13 10:35:35.102959 master-0 kubenswrapper[3972]: I0313 10:35:35.102949 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"3e57f0c206254ea3660894f2b43e0962df459b11648d6e3f38e8d9b4b235affb"}
Mar 13 10:35:35.102959 master-0 kubenswrapper[3972]: I0313 10:35:35.102969 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"faac5e329585d430afa85413196ec70d876c8e306f516444729861dbf4543445"}
Mar 13 10:35:35.104288 master-0 kubenswrapper[3972]: I0313 10:35:35.102986 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"fe506e16bf970e0974dba16aaea6afa314d9c57d8900b0995c3107b8a4cb3261"}
Mar 13 10:35:35.104288 master-0 kubenswrapper[3972]: I0313 10:35:35.103003 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"06939486a208e579a527e3fd963447fe09a8c883f53d1384dddfe29aa63e21b3"}
Mar 13 10:35:35.104288 master-0 kubenswrapper[3972]: I0313 10:35:35.103020 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"eb29cfca46e834bca84bd552c0afc75c789697afff5f022856c3a37252a97f97"}
Mar 13 10:35:35.309249 master-0 kubenswrapper[3972]: I0313 10:35:35.308832 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:35.309249 master-0 kubenswrapper[3972]: E0313 10:35:35.309025 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:36.309794 master-0 kubenswrapper[3972]: I0313 10:35:36.309705 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:36.310417 master-0 kubenswrapper[3972]: E0313 10:35:36.309902 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:37.309510 master-0 kubenswrapper[3972]: I0313 10:35:37.309428 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:37.309824 master-0 kubenswrapper[3972]: E0313 10:35:37.309608 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:37.438513 master-0 kubenswrapper[3972]: I0313 10:35:37.438179 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:37.438716 master-0 kubenswrapper[3972]: E0313 10:35:37.438385 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 13 10:35:37.438716 master-0 kubenswrapper[3972]: E0313 10:35:37.438575 3972 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 13 10:35:37.438716 master-0 kubenswrapper[3972]: E0313 10:35:37.438592 3972 projected.go:194] Error preparing data for projected volume kube-api-access-zltcf for pod openshift-network-diagnostics/network-check-target-jwfjl: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:37.438716 master-0 kubenswrapper[3972]: E0313 10:35:37.438649 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf podName:a7b698d2-f23a-4404-bc63-757ca549356f nodeName:}" failed. No retries permitted until 2026-03-13 10:36:09.438632965 +0000 UTC m=+172.056749353 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-zltcf" (UniqueName: "kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf") pod "network-check-target-jwfjl" (UID: "a7b698d2-f23a-4404-bc63-757ca549356f") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 13 10:35:38.124492 master-0 kubenswrapper[3972]: I0313 10:35:38.124405 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"0f5ebe341252592ddb7ea07f27157b630ec6d6698481394ac477334f68310522"}
Mar 13 10:35:38.266322 master-0 kubenswrapper[3972]: E0313 10:35:38.266223 3972 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 10:35:38.308644 master-0 kubenswrapper[3972]: I0313 10:35:38.308532 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:38.310235 master-0 kubenswrapper[3972]: E0313 10:35:38.310145 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:39.309419 master-0 kubenswrapper[3972]: I0313 10:35:39.309311 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:39.309739 master-0 kubenswrapper[3972]: E0313 10:35:39.309468 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:40.137579 master-0 kubenswrapper[3972]: I0313 10:35:40.137453 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"b650c74f8b57c73e892b63268846a8c6d8dd851805ffc652eb497ec8ad4cfef2"}
Mar 13 10:35:40.139233 master-0 kubenswrapper[3972]: I0313 10:35:40.139188 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:40.139318 master-0 kubenswrapper[3972]: I0313 10:35:40.139246 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:40.139430 master-0 kubenswrapper[3972]: I0313 10:35:40.139337 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:40.199070 master-0 kubenswrapper[3972]: I0313 10:35:40.198911 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:40.201677 master-0 kubenswrapper[3972]: I0313 10:35:40.201496 3972 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:35:40.222305 master-0 kubenswrapper[3972]: I0313 10:35:40.222212 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" podStartSLOduration=7.22218643 podStartE2EDuration="7.22218643s" podCreationTimestamp="2026-03-13 10:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:35:40.193914752 +0000 UTC m=+142.812031140" watchObservedRunningTime="2026-03-13 10:35:40.22218643 +0000 UTC m=+142.840302818"
Mar 13 10:35:40.310077 master-0 kubenswrapper[3972]: I0313 10:35:40.309730 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:40.310403 master-0 kubenswrapper[3972]: E0313 10:35:40.310186 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:41.308772 master-0 kubenswrapper[3972]: I0313 10:35:41.308601 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:41.310477 master-0 kubenswrapper[3972]: E0313 10:35:41.308869 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:42.309948 master-0 kubenswrapper[3972]: I0313 10:35:42.309402 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:42.311367 master-0 kubenswrapper[3972]: E0313 10:35:42.310219 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:43.268949 master-0 kubenswrapper[3972]: E0313 10:35:43.268798 3972 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
Mar 13 10:35:43.308958 master-0 kubenswrapper[3972]: I0313 10:35:43.308861 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:43.309238 master-0 kubenswrapper[3972]: E0313 10:35:43.309039 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:44.309816 master-0 kubenswrapper[3972]: I0313 10:35:44.309692 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:44.310855 master-0 kubenswrapper[3972]: E0313 10:35:44.309889 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:45.309314 master-0 kubenswrapper[3972]: I0313 10:35:45.309160 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:45.310164 master-0 kubenswrapper[3972]: E0313 10:35:45.309434 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:46.309476 master-0 kubenswrapper[3972]: I0313 10:35:46.309376 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:46.310497 master-0 kubenswrapper[3972]: E0313 10:35:46.309897 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c5vhc" podUID="8df2728b-4f21-4aef-b31f-4197bbcd2728"
Mar 13 10:35:47.309182 master-0 kubenswrapper[3972]: I0313 10:35:47.309060 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:47.309434 master-0 kubenswrapper[3972]: E0313 10:35:47.309299 3972 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-jwfjl" podUID="a7b698d2-f23a-4404-bc63-757ca549356f"
Mar 13 10:35:48.309698 master-0 kubenswrapper[3972]: I0313 10:35:48.309516 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:35:48.317507 master-0 kubenswrapper[3972]: I0313 10:35:48.317437 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 10:35:49.309528 master-0 kubenswrapper[3972]: I0313 10:35:49.309391 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:35:49.313191 master-0 kubenswrapper[3972]: I0313 10:35:49.313136 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 10:35:49.313579 master-0 kubenswrapper[3972]: I0313 10:35:49.313215 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 10:35:51.937013 master-0 kubenswrapper[3972]: I0313 10:35:51.936953 3972 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady"
Mar 13 10:35:52.476069 master-0 kubenswrapper[3972]: I0313 10:35:52.475976 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"]
Mar 13 10:35:52.476810 master-0 kubenswrapper[3972]: I0313 10:35:52.476764 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:35:52.478780 master-0 kubenswrapper[3972]: I0313 10:35:52.478727 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 10:35:52.479308 master-0 kubenswrapper[3972]: I0313 10:35:52.479275 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:35:52.480395 master-0 kubenswrapper[3972]: I0313 10:35:52.480352 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 10:35:52.595400 master-0 kubenswrapper[3972]: I0313 10:35:52.595261 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:35:52.595400 master-0 kubenswrapper[3972]: I0313 10:35:52.595322 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:35:52.595706 master-0 kubenswrapper[3972]: I0313 10:35:52.595465 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:35:52.601201 master-0 kubenswrapper[3972]: I0313 10:35:52.601165 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"]
Mar 13 10:35:52.601863 master-0 kubenswrapper[3972]: I0313 10:35:52.601842 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:35:52.606524 master-0 kubenswrapper[3972]: I0313 10:35:52.606461 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"]
Mar 13 10:35:52.606880 master-0 kubenswrapper[3972]: I0313 10:35:52.606851 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"]
Mar 13 10:35:52.607049 master-0 kubenswrapper[3972]: I0313 10:35:52.607013 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 10:35:52.607151 master-0 kubenswrapper[3972]: I0313 10:35:52.607114 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"]
Mar 13 10:35:52.607281 master-0 kubenswrapper[3972]: I0313 10:35:52.607245 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 10:35:52.607350 master-0 kubenswrapper[3972]: I0313 10:35:52.607321 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 10:35:52.607510 master-0 kubenswrapper[3972]: I0313 10:35:52.607488 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"
Mar 13 10:35:52.607575 master-0 kubenswrapper[3972]: I0313 10:35:52.607517 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:35:52.607575 master-0 kubenswrapper[3972]: I0313 10:35:52.607529 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:35:52.607747 master-0 kubenswrapper[3972]: I0313 10:35:52.607725 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"]
Mar 13 10:35:52.608155 master-0 kubenswrapper[3972]: I0313 10:35:52.608126 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:35:52.609665 master-0 kubenswrapper[3972]: I0313 10:35:52.609631 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 10:35:52.610026 master-0 kubenswrapper[3972]: I0313 10:35:52.610009 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 10:35:52.610160 master-0 kubenswrapper[3972]: I0313 10:35:52.610131 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"]
Mar 13 10:35:52.610607 master-0 kubenswrapper[3972]: I0313 10:35:52.610584 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"]
Mar 13 10:35:52.610778 master-0 kubenswrapper[3972]: I0313 10:35:52.610752 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:35:52.610869 master-0 kubenswrapper[3972]: I0313 10:35:52.610810 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"]
Mar 13 10:35:52.611017 master-0 kubenswrapper[3972]: I0313 10:35:52.610955 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:35:52.611118 master-0 kubenswrapper[3972]: I0313 10:35:52.611046 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:35:52.612340 master-0 kubenswrapper[3972]: I0313 10:35:52.612319 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"]
Mar 13 10:35:52.612804 master-0 kubenswrapper[3972]: I0313 10:35:52.612786 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:35:52.617966 master-0 kubenswrapper[3972]: I0313 10:35:52.617923 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 10:35:52.618126 master-0 kubenswrapper[3972]: I0313 10:35:52.617969 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 10:35:52.618273 master-0 kubenswrapper[3972]: I0313 10:35:52.618207 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 10:35:52.618273 master-0 kubenswrapper[3972]: I0313 10:35:52.618249 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 10:35:52.618469 master-0 kubenswrapper[3972]: I0313 10:35:52.618410 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 10:35:52.618882 master-0 kubenswrapper[3972]: I0313 10:35:52.618791 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:35:52.619308 master-0 kubenswrapper[3972]: I0313 10:35:52.619120 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 10:35:52.620347 master-0 kubenswrapper[3972]: I0313 10:35:52.619501 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"]
Mar 13 10:35:52.620347 master-0 kubenswrapper[3972]: I0313 10:35:52.619930 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:35:52.620347 master-0 kubenswrapper[3972]: I0313 10:35:52.620293 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"]
Mar 13 10:35:52.620801 master-0 kubenswrapper[3972]: I0313 10:35:52.620708 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"
Mar 13 10:35:52.621122 master-0 kubenswrapper[3972]: I0313 10:35:52.621078 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 10:35:52.621817 master-0 kubenswrapper[3972]: I0313 10:35:52.621796 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 10:35:52.621976 master-0 kubenswrapper[3972]: I0313 10:35:52.621940 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"]
Mar 13 10:35:52.622226 master-0 kubenswrapper[3972]: I0313 10:35:52.622130 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 10:35:52.622588 master-0 kubenswrapper[3972]: I0313 10:35:52.622507 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:35:52.622896 master-0 kubenswrapper[3972]: I0313 10:35:52.622865 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"]
Mar 13 10:35:52.625536 master-0 kubenswrapper[3972]: I0313 10:35:52.623586 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"
Mar 13 10:35:52.625536 master-0 kubenswrapper[3972]: I0313 10:35:52.624298 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"]
Mar 13 10:35:52.625536 master-0 kubenswrapper[3972]: I0313 10:35:52.624686 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"]
Mar 13 10:35:52.625536 master-0 kubenswrapper[3972]: I0313 10:35:52.624957 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:35:52.625536 master-0 kubenswrapper[3972]: I0313 10:35:52.624987 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:35:52.628135 master-0 kubenswrapper[3972]: I0313 10:35:52.627940 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 10:35:52.628299 master-0 kubenswrapper[3972]: I0313 10:35:52.628222 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 10:35:52.628446 master-0 kubenswrapper[3972]: I0313 10:35:52.628311 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z"]
Mar 13 10:35:52.628541 master-0 kubenswrapper[3972]: I0313 10:35:52.628513 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 10:35:52.628865 master-0 kubenswrapper[3972]: I0313 10:35:52.628795 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"]
Mar 13 10:35:52.629132 master-0 kubenswrapper[3972]: I0313 10:35:52.629041 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z"
Mar 13 10:35:52.629132 master-0 kubenswrapper[3972]: I0313 10:35:52.629069 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.634154 master-0 kubenswrapper[3972]: I0313 10:35:52.634114 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 10:35:52.634334 master-0 kubenswrapper[3972]: I0313 10:35:52.634214 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 10:35:52.634378 master-0 kubenswrapper[3972]: I0313 10:35:52.634350 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.634446 master-0 kubenswrapper[3972]: I0313 10:35:52.634405 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 10:35:52.634630 master-0 kubenswrapper[3972]: I0313 10:35:52.634606 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.634630 master-0 kubenswrapper[3972]: I0313 10:35:52.634620 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"] Mar 13 10:35:52.634798 master-0 kubenswrapper[3972]: I0313 10:35:52.634773 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 13 10:35:52.641334 master-0 kubenswrapper[3972]: I0313 10:35:52.641283 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.643520 master-0 kubenswrapper[3972]: I0313 10:35:52.643479 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 10:35:52.644483 master-0 kubenswrapper[3972]: I0313 10:35:52.644457 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 13 10:35:52.644876 master-0 kubenswrapper[3972]: I0313 10:35:52.644856 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.645657 master-0 kubenswrapper[3972]: I0313 10:35:52.645629 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 13 10:35:52.645750 master-0 kubenswrapper[3972]: I0313 10:35:52.645712 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 10:35:52.656815 master-0 kubenswrapper[3972]: I0313 10:35:52.656773 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd"] Mar 13 10:35:52.657238 master-0 kubenswrapper[3972]: I0313 10:35:52.657212 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-6gzxr"] Mar 13 10:35:52.657311 master-0 kubenswrapper[3972]: I0313 10:35:52.657251 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.657638 master-0 kubenswrapper[3972]: I0313 10:35:52.657592 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-6zkqh"] Mar 13 10:35:52.657900 master-0 kubenswrapper[3972]: I0313 10:35:52.657869 3972 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.658400 master-0 kubenswrapper[3972]: I0313 10:35:52.658366 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 10:35:52.658498 master-0 kubenswrapper[3972]: I0313 10:35:52.657996 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:52.658745 master-0 kubenswrapper[3972]: I0313 10:35:52.658724 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 10:35:52.658815 master-0 kubenswrapper[3972]: I0313 10:35:52.658743 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 10:35:52.658815 master-0 kubenswrapper[3972]: I0313 10:35:52.657966 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:52.658928 master-0 kubenswrapper[3972]: I0313 10:35:52.658841 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 10:35:52.658981 master-0 kubenswrapper[3972]: I0313 10:35:52.658929 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.659076 master-0 kubenswrapper[3972]: I0313 10:35:52.659007 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 10:35:52.659184 master-0 kubenswrapper[3972]: I0313 10:35:52.659119 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.659256 master-0 kubenswrapper[3972]: I0313 10:35:52.659236 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:35:52.659386 master-0 kubenswrapper[3972]: I0313 10:35:52.659314 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 10:35:52.659456 master-0 kubenswrapper[3972]: I0313 10:35:52.659425 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.659456 master-0 kubenswrapper[3972]: I0313 10:35:52.659448 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 13 10:35:52.659554 master-0 kubenswrapper[3972]: I0313 10:35:52.659536 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 13 10:35:52.659620 master-0 
kubenswrapper[3972]: I0313 10:35:52.659602 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 13 10:35:52.659769 master-0 kubenswrapper[3972]: I0313 10:35:52.659747 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.659855 master-0 kubenswrapper[3972]: I0313 10:35:52.659836 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 10:35:52.659927 master-0 kubenswrapper[3972]: I0313 10:35:52.659910 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 13 10:35:52.659982 master-0 kubenswrapper[3972]: I0313 10:35:52.659942 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.660029 master-0 kubenswrapper[3972]: I0313 10:35:52.659991 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 13 10:35:52.660076 master-0 kubenswrapper[3972]: I0313 10:35:52.660036 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 10:35:52.660076 master-0 kubenswrapper[3972]: I0313 10:35:52.660058 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.660183 master-0 kubenswrapper[3972]: I0313 10:35:52.660152 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 10:35:52.660254 master-0 kubenswrapper[3972]: I0313 10:35:52.660237 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" 
Mar 13 10:35:52.660472 master-0 kubenswrapper[3972]: I0313 10:35:52.660454 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 13 10:35:52.660592 master-0 kubenswrapper[3972]: I0313 10:35:52.660574 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 10:35:52.660699 master-0 kubenswrapper[3972]: I0313 10:35:52.660680 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 10:35:52.660792 master-0 kubenswrapper[3972]: I0313 10:35:52.660776 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 10:35:52.660897 master-0 kubenswrapper[3972]: I0313 10:35:52.660879 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 10:35:52.661223 master-0 kubenswrapper[3972]: I0313 10:35:52.661165 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"] Mar 13 10:35:52.661352 master-0 kubenswrapper[3972]: I0313 10:35:52.661294 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 10:35:52.661452 master-0 kubenswrapper[3972]: I0313 10:35:52.661297 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 10:35:52.670007 master-0 kubenswrapper[3972]: I0313 10:35:52.669944 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 10:35:52.674076 master-0 kubenswrapper[3972]: I0313 10:35:52.674025 3972 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 10:35:52.745912 master-0 kubenswrapper[3972]: I0313 10:35:52.675048 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 10:35:52.745912 master-0 kubenswrapper[3972]: I0313 10:35:52.676723 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 10:35:52.746219 master-0 kubenswrapper[3972]: I0313 10:35:52.676976 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 10:35:52.746219 master-0 kubenswrapper[3972]: I0313 10:35:52.677029 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.746219 master-0 kubenswrapper[3972]: I0313 10:35:52.677203 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 10:35:52.746369 master-0 kubenswrapper[3972]: I0313 10:35:52.746214 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:52.746369 master-0 kubenswrapper[3972]: I0313 10:35:52.746313 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:52.746369 master-0 
kubenswrapper[3972]: I0313 10:35:52.746345 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:52.751797 master-0 kubenswrapper[3972]: I0313 10:35:52.747716 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:52.751797 master-0 kubenswrapper[3972]: I0313 10:35:52.749315 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 13 10:35:52.751797 master-0 kubenswrapper[3972]: I0313 10:35:52.749851 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 10:35:52.751797 master-0 kubenswrapper[3972]: I0313 10:35:52.750020 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 10:35:52.751797 master-0 kubenswrapper[3972]: I0313 10:35:52.750787 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 10:35:52.754276 master-0 kubenswrapper[3972]: I0313 10:35:52.752376 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"] Mar 13 10:35:52.754276 master-0 kubenswrapper[3972]: I0313 10:35:52.752905 3972 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 10:35:52.754276 master-0 kubenswrapper[3972]: I0313 10:35:52.753158 3972 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 13 10:35:52.754276 master-0 kubenswrapper[3972]: I0313 10:35:52.753299 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 10:35:52.754276 master-0 kubenswrapper[3972]: I0313 10:35:52.753502 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"] Mar 13 10:35:52.754276 master-0 kubenswrapper[3972]: I0313 10:35:52.753569 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"] Mar 13 10:35:52.754767 master-0 kubenswrapper[3972]: I0313 10:35:52.754716 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:52.754767 master-0 kubenswrapper[3972]: I0313 10:35:52.754760 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"] Mar 13 10:35:52.762314 master-0 kubenswrapper[3972]: I0313 10:35:52.762072 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"] Mar 13 10:35:52.770159 master-0 kubenswrapper[3972]: I0313 10:35:52.764734 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"] Mar 13 10:35:52.770159 master-0 
kubenswrapper[3972]: I0313 10:35:52.768128 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"] Mar 13 10:35:52.770534 master-0 kubenswrapper[3972]: I0313 10:35:52.770495 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"] Mar 13 10:35:52.775194 master-0 kubenswrapper[3972]: I0313 10:35:52.775138 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"] Mar 13 10:35:52.775777 master-0 kubenswrapper[3972]: I0313 10:35:52.775750 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"] Mar 13 10:35:52.776551 master-0 kubenswrapper[3972]: I0313 10:35:52.776509 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"] Mar 13 10:35:52.777245 master-0 kubenswrapper[3972]: I0313 10:35:52.777219 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"] Mar 13 10:35:52.778069 master-0 kubenswrapper[3972]: I0313 10:35:52.778029 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-6zkqh"] Mar 13 10:35:52.778989 master-0 kubenswrapper[3972]: I0313 10:35:52.778963 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"] Mar 13 10:35:52.780588 master-0 kubenswrapper[3972]: I0313 10:35:52.780558 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"] Mar 13 10:35:52.781617 master-0 kubenswrapper[3972]: I0313 10:35:52.781587 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z"] Mar 13 10:35:52.782440 master-0 kubenswrapper[3972]: I0313 10:35:52.782411 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"] Mar 13 10:35:52.783501 master-0 kubenswrapper[3972]: I0313 10:35:52.783463 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"] Mar 13 10:35:52.784158 master-0 kubenswrapper[3972]: I0313 10:35:52.784107 3972 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-55t7x"] Mar 13 10:35:52.784857 master-0 kubenswrapper[3972]: I0313 10:35:52.784827 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:52.785160 master-0 kubenswrapper[3972]: I0313 10:35:52.785131 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd"] Mar 13 10:35:52.786537 master-0 kubenswrapper[3972]: I0313 10:35:52.785996 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"] Mar 13 10:35:52.786799 master-0 kubenswrapper[3972]: I0313 10:35:52.786765 3972 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 10:35:52.786989 master-0 kubenswrapper[3972]: I0313 10:35:52.786954 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-6gzxr"] Mar 13 10:35:52.800721 master-0 kubenswrapper[3972]: I0313 10:35:52.800679 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") 
pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847780 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847822 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847851 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847874 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: 
\"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847900 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847923 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847939 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:52.847949 master-0 kubenswrapper[3972]: I0313 10:35:52.847955 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.848392 master-0 
kubenswrapper[3972]: I0313 10:35:52.847977 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.847999 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848019 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848035 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848050 3972 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848066 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848088 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848116 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848132 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848150 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848164 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848180 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848199 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: 
\"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.848392 master-0 kubenswrapper[3972]: I0313 10:35:52.848214 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848228 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848251 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848279 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848293 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848324 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848341 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848364 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 
10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848387 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwfd8\" (UniqueName: \"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848413 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848433 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848453 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848475 3972 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848503 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.848784 master-0 kubenswrapper[3972]: I0313 10:35:52.848522 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848537 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848557 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848576 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848592 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848607 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848623 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod 
\"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848659 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848674 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848695 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848711 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" 
Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848726 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848749 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848763 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:52.849309 master-0 kubenswrapper[3972]: I0313 10:35:52.848777 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848804 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848821 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848844 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848868 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848883 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848898 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848934 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848949 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.848978 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" 
(UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.849010 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.849025 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.849051 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 10:35:52.849065 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.849717 master-0 kubenswrapper[3972]: I0313 
10:35:52.849114 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.949690 master-0 kubenswrapper[3972]: I0313 10:35:52.949656 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.950162 master-0 kubenswrapper[3972]: I0313 10:35:52.950131 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.950395 master-0 kubenswrapper[3972]: I0313 10:35:52.950376 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfd8\" (UniqueName: \"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.950488 master-0 kubenswrapper[3972]: I0313 10:35:52.950475 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.950573 master-0 kubenswrapper[3972]: I0313 10:35:52.950557 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.950670 master-0 kubenswrapper[3972]: I0313 10:35:52.950656 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.950769 master-0 kubenswrapper[3972]: I0313 10:35:52.950753 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:52.950861 master-0 kubenswrapper[3972]: I0313 10:35:52.950848 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.950943 master-0 kubenswrapper[3972]: I0313 10:35:52.950927 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.951035 master-0 kubenswrapper[3972]: I0313 10:35:52.951022 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.951139 master-0 kubenswrapper[3972]: I0313 10:35:52.951123 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.951233 master-0 kubenswrapper[3972]: I0313 10:35:52.951220 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.951521 master-0 kubenswrapper[3972]: I0313 10:35:52.951486 
3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.951625 master-0 kubenswrapper[3972]: I0313 10:35:52.951579 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.951677 master-0 kubenswrapper[3972]: I0313 10:35:52.951656 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:52.952228 master-0 kubenswrapper[3972]: I0313 10:35:52.952210 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.952399 master-0 kubenswrapper[3972]: I0313 10:35:52.952365 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod 
\"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.953234 master-0 kubenswrapper[3972]: I0313 10:35:52.953190 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.953294 master-0 kubenswrapper[3972]: I0313 10:35:52.953275 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.953357 master-0 kubenswrapper[3972]: I0313 10:35:52.953302 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.953396 master-0 kubenswrapper[3972]: I0313 10:35:52.953369 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 
10:35:52.953429 master-0 kubenswrapper[3972]: I0313 10:35:52.953399 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.953461 master-0 kubenswrapper[3972]: I0313 10:35:52.953444 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.953533 master-0 kubenswrapper[3972]: I0313 10:35:52.953511 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.953573 master-0 kubenswrapper[3972]: I0313 10:35:52.953541 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:52.953573 master-0 kubenswrapper[3972]: E0313 10:35:52.953550 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:35:52.953573 master-0 
kubenswrapper[3972]: I0313 10:35:52.953567 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.953663 master-0 kubenswrapper[3972]: I0313 10:35:52.953590 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.953663 master-0 kubenswrapper[3972]: E0313 10:35:52.953630 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.453595956 +0000 UTC m=+156.071712344 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:35:52.953663 master-0 kubenswrapper[3972]: I0313 10:35:52.953649 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:52.953789 master-0 kubenswrapper[3972]: I0313 10:35:52.953675 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.953789 master-0 kubenswrapper[3972]: I0313 10:35:52.953664 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.953789 master-0 kubenswrapper[3972]: I0313 10:35:52.953697 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" 
(UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:52.953789 master-0 kubenswrapper[3972]: I0313 10:35:52.953737 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.953789 master-0 kubenswrapper[3972]: E0313 10:35:52.953687 3972 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.956989 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.957070 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.957151 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" 
(UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.957184 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.957217 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.959454 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.959531 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.959564 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.959595 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.960627 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.961977 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.962541 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.971124 master-0 
kubenswrapper[3972]: E0313 10:35:52.963594 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: I0313 10:35:52.963681 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.971124 master-0 kubenswrapper[3972]: E0313 10:35:52.963706 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.463680579 +0000 UTC m=+156.081796987 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.964559 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.964915 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: E0313 10:35:52.965033 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.465017724 +0000 UTC m=+156.083134122 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.965363 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.965403 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: E0313 10:35:52.965483 3972 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: E0313 10:35:52.965523 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.465505327 +0000 UTC m=+156.083621735 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.965600 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.966751 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.967012 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.967192 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.971807 
master-0 kubenswrapper[3972]: I0313 10:35:52.967246 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.967277 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.967299 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.971807 master-0 kubenswrapper[3972]: I0313 10:35:52.967316 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967336 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: 
\"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967356 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967381 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967400 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967424 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" 
Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967446 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967466 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967554 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967655 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967714 3972 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967773 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967799 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967823 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967845 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: 
\"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:52.972267 master-0 kubenswrapper[3972]: I0313 10:35:52.967864 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.967884 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.967900 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.967919 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.967950 3972 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.967971 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.967995 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.968028 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.968050 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.968071 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.968120 3972 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn5nv\" (UniqueName: \"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: E0313 10:35:52.968413 3972 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: E0313 10:35:52.968506 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.468465564 +0000 UTC m=+156.086581962 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: I0313 10:35:52.969065 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: E0313 10:35:52.969254 3972 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:35:52.972732 master-0 kubenswrapper[3972]: E0313 10:35:52.969328 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.469284655 +0000 UTC m=+156.087401043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.969409 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.969442 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.469434079 +0000 UTC m=+156.087550467 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.969542 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.969567 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.469556882 +0000 UTC m=+156.087673270 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: I0313 10:35:52.969994 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: I0313 10:35:52.970125 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.970332 3972 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.970384 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.470368273 +0000 UTC m=+156.088484681 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.970450 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.970484 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.470475406 +0000 UTC m=+156.088591804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.970544 3972 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: E0313 10:35:52.970573 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:53.470560769 +0000 UTC m=+156.088677167 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: I0313 10:35:52.970905 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: I0313 10:35:52.971923 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.973187 master-0 kubenswrapper[3972]: I0313 10:35:52.971919 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.973890 master-0 kubenswrapper[3972]: I0313 10:35:52.972848 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.973890 master-0 kubenswrapper[3972]: I0313 10:35:52.972901 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:52.973890 master-0 kubenswrapper[3972]: I0313 10:35:52.973196 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:52.980528 master-0 kubenswrapper[3972]: I0313 10:35:52.976576 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.980528 master-0 kubenswrapper[3972]: I0313 10:35:52.977457 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.980528 master-0 kubenswrapper[3972]: I0313 10:35:52.978751 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.986770 master-0 kubenswrapper[3972]: I0313 10:35:52.986730 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:52.989282 master-0 kubenswrapper[3972]: I0313 10:35:52.988023 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.991853 master-0 kubenswrapper[3972]: I0313 10:35:52.991816 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:52.992429 master-0 kubenswrapper[3972]: I0313 10:35:52.992399 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " 
pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.992547 master-0 kubenswrapper[3972]: I0313 10:35:52.992517 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:52.992682 master-0 kubenswrapper[3972]: I0313 10:35:52.992645 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:52.992981 master-0 kubenswrapper[3972]: I0313 10:35:52.992956 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:52.993194 master-0 kubenswrapper[3972]: I0313 10:35:52.993175 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:52.993528 master-0 kubenswrapper[3972]: I0313 10:35:52.993502 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfd8\" (UniqueName: 
\"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:52.994073 master-0 kubenswrapper[3972]: I0313 10:35:52.994048 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:52.995864 master-0 kubenswrapper[3972]: I0313 10:35:52.995808 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:52.995992 master-0 kubenswrapper[3972]: I0313 10:35:52.995919 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:52.996297 master-0 kubenswrapper[3972]: I0313 10:35:52.996254 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: 
\"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:52.996358 master-0 kubenswrapper[3972]: I0313 10:35:52.996308 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:52.996719 master-0 kubenswrapper[3972]: I0313 10:35:52.996682 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:52.999528 master-0 kubenswrapper[3972]: I0313 10:35:52.999491 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:52.999957 master-0 kubenswrapper[3972]: I0313 10:35:52.999915 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:53.007065 master-0 
kubenswrapper[3972]: I0313 10:35:53.007022 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:53.024332 master-0 kubenswrapper[3972]: I0313 10:35:53.024253 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:53.042517 master-0 kubenswrapper[3972]: I0313 10:35:53.042427 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:53.059455 master-0 kubenswrapper[3972]: I0313 10:35:53.059390 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:53.069826 master-0 kubenswrapper[3972]: I0313 10:35:53.069756 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn5nv\" (UniqueName: 
\"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.069971 master-0 kubenswrapper[3972]: I0313 10:35:53.069927 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.070035 master-0 kubenswrapper[3972]: I0313 10:35:53.069994 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.070261 master-0 kubenswrapper[3972]: I0313 10:35:53.070222 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.071530 master-0 kubenswrapper[3972]: I0313 10:35:53.071484 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.089111 master-0 kubenswrapper[3972]: I0313 10:35:53.089001 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:53.098387 master-0 kubenswrapper[3972]: I0313 10:35:53.098329 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:53.099919 master-0 kubenswrapper[3972]: I0313 10:35:53.099875 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:35:53.102718 master-0 kubenswrapper[3972]: I0313 10:35:53.102032 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:35:53.124717 master-0 kubenswrapper[3972]: I0313 10:35:53.124624 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:35:53.139281 master-0 kubenswrapper[3972]: I0313 10:35:53.139211 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:53.148600 master-0 kubenswrapper[3972]: I0313 10:35:53.148524 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:35:53.160996 master-0 kubenswrapper[3972]: I0313 10:35:53.160924 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:35:53.170760 master-0 kubenswrapper[3972]: I0313 10:35:53.170667 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:35:53.170760 master-0 kubenswrapper[3972]: I0313 10:35:53.170743 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:53.180957 master-0 kubenswrapper[3972]: I0313 10:35:53.180892 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:35:53.191062 master-0 kubenswrapper[3972]: I0313 10:35:53.190998 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:35:53.201844 master-0 kubenswrapper[3972]: I0313 10:35:53.201800 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:35:53.202967 master-0 kubenswrapper[3972]: I0313 10:35:53.202921 3972 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn5nv\" (UniqueName: \"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.210719 master-0 kubenswrapper[3972]: I0313 10:35:53.210657 3972 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:35:53.219629 master-0 kubenswrapper[3972]: I0313 10:35:53.217258 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:35:53.224242 master-0 kubenswrapper[3972]: I0313 10:35:53.221403 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:35:53.244446 master-0 kubenswrapper[3972]: I0313 10:35:53.243470 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:35:53.248255 master-0 kubenswrapper[3972]: I0313 10:35:53.246708 3972 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:35:53.446752 master-0 kubenswrapper[3972]: I0313 10:35:53.446705 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"] Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: W0313 10:35:53.473732 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53da2840_4a92_497a_a9d3_973583887147.slice/crio-aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350 WatchSource:0}: Error finding container aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350: Status 404 returned error can't find the container with id aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350 Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474085 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474214 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: E0313 10:35:53.474340 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: E0313 10:35:53.474344 3972 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: E0313 10:35:53.474391 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.474376359 +0000 UTC m=+157.092492747 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474424 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474473 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474506 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474535 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: 
\"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474585 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474611 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474646 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474688 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:53.475348 master-0 kubenswrapper[3972]: I0313 10:35:53.474725 3972 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474783 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.474771849 +0000 UTC m=+157.092888237 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474812 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474829 3972 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474852 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.474835551 +0000 UTC m=+157.092952019 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474876 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474874 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.474864262 +0000 UTC m=+157.092980740 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474901 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.474894782 +0000 UTC m=+157.093011170 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474917 3972 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.474949 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.474940074 +0000 UTC m=+157.093056572 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.475002 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.475025 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.475017696 +0000 UTC m=+157.093134174 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.475072 3972 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.475111 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.475087837 +0000 UTC m=+157.093204345 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:35:53.476004 master-0 kubenswrapper[3972]: E0313 10:35:53.475157 3972 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:35:53.476684 master-0 kubenswrapper[3972]: E0313 10:35:53.475183 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.47517541 +0000 UTC m=+157.093291888 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:35:53.476684 master-0 kubenswrapper[3972]: E0313 10:35:53.475227 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:35:53.476684 master-0 kubenswrapper[3972]: E0313 10:35:53.475250 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.475242012 +0000 UTC m=+157.093358490 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:35:53.476684 master-0 kubenswrapper[3972]: E0313 10:35:53.475292 3972 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:35:53.476684 master-0 kubenswrapper[3972]: E0313 10:35:53.475316 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:54.475308443 +0000 UTC m=+157.093424921 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:35:53.521219 master-0 kubenswrapper[3972]: I0313 10:35:53.521148 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"] Mar 13 10:35:53.539903 master-0 kubenswrapper[3972]: W0313 10:35:53.539633 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podba3e43ba_2840_4612_a370_87ad3c5a382a.slice/crio-7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524 WatchSource:0}: Error finding container 7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524: Status 404 returned error can't find the container with id 7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524 Mar 13 10:35:53.793907 master-0 kubenswrapper[3972]: I0313 10:35:53.793426 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"] Mar 13 10:35:53.799141 master-0 kubenswrapper[3972]: I0313 10:35:53.799079 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"] Mar 13 10:35:53.802036 master-0 kubenswrapper[3972]: W0313 10:35:53.801995 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6e69683c_59c5_43da_b105_ef2efb2d0a4e.slice/crio-0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2 WatchSource:0}: Error finding container 0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2: Status 404 returned error can't find the 
container with id 0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2 Mar 13 10:35:53.802262 master-0 kubenswrapper[3972]: I0313 10:35:53.802177 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"] Mar 13 10:35:53.805910 master-0 kubenswrapper[3972]: I0313 10:35:53.805873 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd"] Mar 13 10:35:53.807254 master-0 kubenswrapper[3972]: I0313 10:35:53.807220 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"] Mar 13 10:35:53.812666 master-0 kubenswrapper[3972]: I0313 10:35:53.812616 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"] Mar 13 10:35:53.814778 master-0 kubenswrapper[3972]: I0313 10:35:53.814742 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"] Mar 13 10:35:53.816528 master-0 kubenswrapper[3972]: I0313 10:35:53.816493 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"] Mar 13 10:35:53.817763 master-0 kubenswrapper[3972]: W0313 10:35:53.817716 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8c7f667_d30e_41f4_8c0e_f3f138bffab4.slice/crio-7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9 WatchSource:0}: Error finding container 7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9: Status 404 returned error can't find the container with id 7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9 Mar 13 10:35:53.836539 master-0 kubenswrapper[3972]: I0313 10:35:53.836498 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z"] Mar 13 10:35:53.852293 master-0 kubenswrapper[3972]: W0313 10:35:53.852237 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode87ca16c_25de_4fea_b900_2960f4a5f95e.slice/crio-83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300 WatchSource:0}: Error finding container 83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300: Status 404 returned error can't find the container with id 83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300 Mar 13 10:35:53.859213 master-0 kubenswrapper[3972]: I0313 10:35:53.859180 3972 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"] Mar 13 10:35:53.870921 master-0 kubenswrapper[3972]: W0313 10:35:53.870835 3972 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod893dac15_d6d4_4a1f_988c_59aaf9e63334.slice/crio-7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e WatchSource:0}: Error finding container 7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e: Status 404 returned error can't find the container with id 7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e Mar 13 10:35:54.207159 master-0 kubenswrapper[3972]: I0313 10:35:54.207110 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerStarted","Data":"7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e"} Mar 13 10:35:54.208947 master-0 kubenswrapper[3972]: I0313 10:35:54.208889 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" event={"ID":"1f358d81-87c6-40bf-89e8-5681429285f8","Type":"ContainerStarted","Data":"a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018"} Mar 13 10:35:54.210269 master-0 kubenswrapper[3972]: I0313 10:35:54.210228 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-55t7x" event={"ID":"58685de6-b4ae-4229-870b-5143a6010450","Type":"ContainerStarted","Data":"bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5"} Mar 13 10:35:54.211374 master-0 kubenswrapper[3972]: I0313 10:35:54.211331 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerStarted","Data":"aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350"} Mar 13 10:35:54.213079 master-0 kubenswrapper[3972]: I0313 10:35:54.213021 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerStarted","Data":"2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba"} Mar 13 10:35:54.213079 master-0 kubenswrapper[3972]: I0313 10:35:54.213055 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerStarted","Data":"da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327"} Mar 13 10:35:54.215668 master-0 kubenswrapper[3972]: I0313 10:35:54.215593 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" 
event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerStarted","Data":"7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9"} Mar 13 10:35:54.216935 master-0 kubenswrapper[3972]: I0313 10:35:54.216891 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" event={"ID":"e87ca16c-25de-4fea-b900-2960f4a5f95e","Type":"ContainerStarted","Data":"83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300"} Mar 13 10:35:54.218150 master-0 kubenswrapper[3972]: I0313 10:35:54.218065 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" event={"ID":"ecb5bdcc-647d-4292-a33d-dc3df331c206","Type":"ContainerStarted","Data":"e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2"} Mar 13 10:35:54.218861 master-0 kubenswrapper[3972]: I0313 10:35:54.218807 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerStarted","Data":"362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251"} Mar 13 10:35:54.219876 master-0 kubenswrapper[3972]: I0313 10:35:54.219817 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerStarted","Data":"3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372"} Mar 13 10:35:54.220655 master-0 kubenswrapper[3972]: I0313 10:35:54.220633 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00"} Mar 13 10:35:54.222309 master-0 
kubenswrapper[3972]: I0313 10:35:54.222256 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" event={"ID":"ba3e43ba-2840-4612-a370-87ad3c5a382a","Type":"ContainerStarted","Data":"7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524"} Mar 13 10:35:54.223075 master-0 kubenswrapper[3972]: I0313 10:35:54.223035 3972 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerStarted","Data":"0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2"} Mar 13 10:35:54.228083 master-0 kubenswrapper[3972]: I0313 10:35:54.228001 3972 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" podStartSLOduration=115.227962113 podStartE2EDuration="1m55.227962113s" podCreationTimestamp="2026-03-13 10:33:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:35:54.227132501 +0000 UTC m=+156.845248889" watchObservedRunningTime="2026-03-13 10:35:54.227962113 +0000 UTC m=+156.846078501" Mar 13 10:35:54.484900 master-0 kubenswrapper[3972]: E0313 10:35:54.484771 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:35:54.484900 master-0 kubenswrapper[3972]: E0313 10:35:54.484882 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.484855123 +0000 UTC m=+159.102971571 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.484524 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.485781 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.485814 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.485840 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.485975 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486018 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486045 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486143 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486271 3972 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486325 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486353 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: I0313 10:35:54.486376 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:35:54.489173 master-0 kubenswrapper[3972]: E0313 10:35:54.486574 3972 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 10:35:54.490559 master-0 kubenswrapper[3972]: E0313 10:35:54.486760 3972 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:58.486737692 +0000 UTC m=+221.104854080 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found Mar 13 10:35:54.490559 master-0 kubenswrapper[3972]: E0313 10:35:54.490226 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:35:54.490559 master-0 kubenswrapper[3972]: E0313 10:35:54.490484 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490467999 +0000 UTC m=+159.108584377 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:35:54.490559 master-0 kubenswrapper[3972]: E0313 10:35:54.490276 3972 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:54.490559 master-0 kubenswrapper[3972]: E0313 10:35:54.490521 3972 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490523 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.49051583 +0000 UTC m=+159.108632318 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490624 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490598283 +0000 UTC m=+159.108714671 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490314 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490651 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490643594 +0000 UTC m=+159.108759982 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490339 3972 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490671 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490667164 +0000 UTC m=+159.108783552 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490372 3972 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490693 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490688455 +0000 UTC m=+159.108804843 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490368 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490714 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490710116 +0000 UTC m=+159.108826504 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490401 3972 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490735 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490730086 +0000 UTC m=+159.108846474 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:35:54.490805 master-0 kubenswrapper[3972]: E0313 10:35:54.490315 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:35:54.491419 master-0 kubenswrapper[3972]: E0313 10:35:54.490760 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.490753637 +0000 UTC m=+159.108870015 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:35:54.503155 master-0 kubenswrapper[3972]: E0313 10:35:54.503060 3972 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:54.503601 master-0 kubenswrapper[3972]: E0313 10:35:54.503575 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:35:56.503503699 +0000 UTC m=+159.121620087 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:35:56.507168 master-0 kubenswrapper[3972]: I0313 10:35:56.506936 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507249 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507290 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507333 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507358 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507407 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507432 3972 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507481 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: E0313 10:35:56.507488 3972 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507538 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507575 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: I0313 10:35:56.507603 3972 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: E0313 10:35:56.507689 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.507657401 +0000 UTC m=+163.125773799 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: E0313 10:35:56.507710 3972 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: E0313 10:35:56.507755 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.507738323 +0000 UTC m=+163.125854711 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:35:56.508665 master-0 kubenswrapper[3972]: E0313 10:35:56.507820 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.507860 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.507840605 +0000 UTC m=+163.125956993 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.507917 3972 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.507955 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.507938748 +0000 UTC m=+163.126055136 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508002 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508027 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.50801904 +0000 UTC m=+163.126135428 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508075 3972 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508118 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.508109372 +0000 UTC m=+163.126225760 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508182 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508207 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.508199575 +0000 UTC m=+163.126315963 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508254 3972 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508279 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.508271247 +0000 UTC m=+163.126387645 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508324 3972 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508347 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.508339248 +0000 UTC m=+163.126455646 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:35:56.509397 master-0 kubenswrapper[3972]: E0313 10:35:56.508391 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:56.509862 master-0 kubenswrapper[3972]: E0313 10:35:56.508413 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.50840629 +0000 UTC m=+163.126522678 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:35:56.509862 master-0 kubenswrapper[3972]: E0313 10:35:56.508473 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:35:56.509862 master-0 kubenswrapper[3972]: E0313 10:35:56.508497 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:00.508489582 +0000 UTC m=+163.126605970 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:36:00.585111 master-0 kubenswrapper[3972]: I0313 10:36:00.584755 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: I0313 10:36:00.585137 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: 
\"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.584892 3972 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585275 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.58524022 +0000 UTC m=+171.203356608 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: I0313 10:36:00.585301 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585301 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: I0313 10:36:00.585338 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585372 3972 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585392 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585381864 +0000 UTC m=+171.203498252 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585414 3972 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: I0313 10:36:00.585365 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585419 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls 
podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585406264 +0000 UTC m=+171.203522652 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585442 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585468 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585446955 +0000 UTC m=+171.203563343 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: E0313 10:36:00.585498 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585490946 +0000 UTC m=+171.203607334 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:00.585679 master-0 kubenswrapper[3972]: I0313 10:36:00.585529 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: I0313 10:36:00.585579 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585663 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585711 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585687712 +0000 UTC m=+171.203804190 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585738 3972 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: I0313 10:36:00.585734 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585775 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585763024 +0000 UTC m=+171.203879482 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585789 3972 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: I0313 10:36:00.585808 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585824 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585814815 +0000 UTC m=+171.203931203 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585858 3972 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: I0313 10:36:00.585856 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585883 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585874996 +0000 UTC m=+171.203991384 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585904 3972 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: I0313 10:36:00.585931 3972 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:00.586318 master-0 kubenswrapper[3972]: E0313 10:36:00.585959 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.585946168 +0000 UTC m=+171.204062556 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:36:00.586868 master-0 kubenswrapper[3972]: E0313 10:36:00.585995 3972 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:36:00.586868 master-0 kubenswrapper[3972]: E0313 10:36:00.586025 3972 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:08.58601767 +0000 UTC m=+171.204134058 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:36:01.136613 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 13 10:36:01.156491 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 10:36:01.156898 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 13 10:36:01.159274 master-0 systemd[1]: kubelet.service: Consumed 12.497s CPU time. Mar 13 10:36:01.184349 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 13 10:36:01.316068 master-0 kubenswrapper[7508]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 13 10:36:01.316068 master-0 kubenswrapper[7508]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 10:36:01.316068 master-0 kubenswrapper[7508]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:36:01.316068 master-0 kubenswrapper[7508]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:36:01.317038 master-0 kubenswrapper[7508]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 10:36:01.317038 master-0 kubenswrapper[7508]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:36:01.317038 master-0 kubenswrapper[7508]: I0313 10:36:01.316248 7508 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 10:36:01.322051 master-0 kubenswrapper[7508]: W0313 10:36:01.322030 7508 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 13 10:36:01.322170 master-0 kubenswrapper[7508]: W0313 10:36:01.322152 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:36:01.322254 master-0 kubenswrapper[7508]: W0313 10:36:01.322245 7508 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:36:01.322311 master-0 kubenswrapper[7508]: W0313 10:36:01.322302 7508 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:36:01.322368 master-0 kubenswrapper[7508]: W0313 10:36:01.322359 7508 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:36:01.322417 master-0 kubenswrapper[7508]: W0313 10:36:01.322407 7508 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:36:01.322473 master-0 kubenswrapper[7508]: W0313 10:36:01.322464 7508 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:36:01.322534 master-0 kubenswrapper[7508]: W0313 10:36:01.322525 7508 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:36:01.322597 master-0 kubenswrapper[7508]: W0313 10:36:01.322582 7508 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:36:01.322676 master-0 kubenswrapper[7508]: W0313 10:36:01.322668 7508 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:36:01.322735 master-0 kubenswrapper[7508]: W0313 10:36:01.322727 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:36:01.322791 master-0 kubenswrapper[7508]: W0313 10:36:01.322783 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:36:01.322841 master-0 kubenswrapper[7508]: W0313 10:36:01.322833 7508 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:36:01.322893 master-0 kubenswrapper[7508]: W0313 10:36:01.322885 7508 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:36:01.322949 master-0 kubenswrapper[7508]: W0313 10:36:01.322941 7508 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:36:01.322999 master-0 kubenswrapper[7508]: W0313 10:36:01.322991 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:36:01.323087 master-0 kubenswrapper[7508]: W0313 10:36:01.323078 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:36:01.323166 master-0 kubenswrapper[7508]: W0313 10:36:01.323157 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:36:01.323224 master-0 kubenswrapper[7508]: W0313 10:36:01.323215 7508 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:36:01.323282 master-0 kubenswrapper[7508]: W0313 10:36:01.323273 7508 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:36:01.323332 master-0 kubenswrapper[7508]: W0313 10:36:01.323324 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:36:01.323390 master-0 kubenswrapper[7508]: W0313 10:36:01.323382 7508 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:36:01.323449 master-0 kubenswrapper[7508]: W0313 10:36:01.323440 7508 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:36:01.323532 master-0 kubenswrapper[7508]: W0313 10:36:01.323523 7508 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:36:01.323592 master-0 kubenswrapper[7508]: W0313 10:36:01.323583 7508 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:36:01.323649 master-0 kubenswrapper[7508]: W0313 10:36:01.323640 7508 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:36:01.323695 master-0 kubenswrapper[7508]: W0313 10:36:01.323687 7508 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:36:01.323739 master-0 kubenswrapper[7508]: W0313 10:36:01.323731 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:36:01.323785 master-0 kubenswrapper[7508]: W0313 10:36:01.323777 7508 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:36:01.323843 master-0 kubenswrapper[7508]: W0313 10:36:01.323835 7508 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:36:01.323905 master-0 kubenswrapper[7508]: W0313 10:36:01.323896 7508 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:36:01.323987 master-0 kubenswrapper[7508]: W0313 10:36:01.323979 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:36:01.324041 master-0 kubenswrapper[7508]: W0313 10:36:01.324033 7508 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:36:01.324108 master-0 kubenswrapper[7508]: W0313 10:36:01.324087 7508 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:36:01.324178 master-0 kubenswrapper[7508]: W0313 10:36:01.324168 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:36:01.324228 master-0 kubenswrapper[7508]: W0313 10:36:01.324220 7508 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:36:01.324286 master-0 kubenswrapper[7508]: W0313 10:36:01.324278 7508 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:36:01.324344 master-0 kubenswrapper[7508]: W0313 10:36:01.324336 7508 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:36:01.324434 master-0 kubenswrapper[7508]: W0313 10:36:01.324425 7508 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:36:01.324503 master-0 kubenswrapper[7508]: W0313 10:36:01.324494 7508 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:36:01.324564 master-0 kubenswrapper[7508]: W0313 10:36:01.324555 7508 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:36:01.324622 master-0 kubenswrapper[7508]: W0313 10:36:01.324613 7508 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:36:01.324669 master-0 kubenswrapper[7508]: W0313 10:36:01.324661 7508 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:36:01.324725 master-0 kubenswrapper[7508]: W0313 10:36:01.324717 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:36:01.324809 master-0 kubenswrapper[7508]: W0313 10:36:01.324801 7508 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:36:01.324860 master-0 kubenswrapper[7508]: W0313 10:36:01.324852 7508 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:36:01.324918 master-0 kubenswrapper[7508]: W0313 10:36:01.324909 7508 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:36:01.324977 master-0 kubenswrapper[7508]: W0313 10:36:01.324969 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:36:01.325038 master-0 kubenswrapper[7508]: W0313 10:36:01.325029 7508 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:36:01.325108 master-0 kubenswrapper[7508]: W0313 10:36:01.325088 7508 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:36:01.325173 master-0 kubenswrapper[7508]: W0313 10:36:01.325164 7508 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:36:01.325236 master-0 kubenswrapper[7508]: W0313 10:36:01.325227 7508 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:36:01.325342 master-0 kubenswrapper[7508]: W0313 10:36:01.325333 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:36:01.325404 master-0 kubenswrapper[7508]: W0313 10:36:01.325396 7508 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:36:01.325462 master-0 kubenswrapper[7508]: W0313 10:36:01.325454 7508 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:36:01.325514 master-0 kubenswrapper[7508]: W0313 10:36:01.325505 7508 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:36:01.325570 master-0 kubenswrapper[7508]: W0313 10:36:01.325561 7508 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:36:01.325627 master-0 kubenswrapper[7508]: W0313 10:36:01.325618 7508 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:36:01.325684 master-0 kubenswrapper[7508]: W0313 10:36:01.325676 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:36:01.325772 master-0 kubenswrapper[7508]: W0313 10:36:01.325763 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:36:01.325832 master-0 kubenswrapper[7508]: W0313 10:36:01.325823 7508 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:36:01.325890 master-0 kubenswrapper[7508]: W0313 10:36:01.325882 7508 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:36:01.325948 master-0 kubenswrapper[7508]: W0313 10:36:01.325940 7508 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:36:01.326008 master-0 kubenswrapper[7508]: W0313 10:36:01.325999 7508 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:36:01.326064 master-0 kubenswrapper[7508]: W0313 10:36:01.326056 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:36:01.326138 master-0 kubenswrapper[7508]: W0313 10:36:01.326129 7508 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:36:01.326203 master-0 kubenswrapper[7508]: W0313 10:36:01.326194 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:36:01.326287 master-0 kubenswrapper[7508]: W0313 10:36:01.326279 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:36:01.326350 master-0 kubenswrapper[7508]: W0313 10:36:01.326341 7508 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:36:01.326398 master-0 kubenswrapper[7508]: W0313 10:36:01.326390 7508 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:36:01.326458 master-0 kubenswrapper[7508]: W0313 10:36:01.326448 7508 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:36:01.326516 master-0 kubenswrapper[7508]: W0313 10:36:01.326508 7508 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:36:01.326706 master-0 kubenswrapper[7508]: I0313 10:36:01.326689 7508 flags.go:64] FLAG: --address="0.0.0.0"
Mar 13 10:36:01.326784 master-0 kubenswrapper[7508]: I0313 10:36:01.326762 7508 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 13 10:36:01.326872 master-0 kubenswrapper[7508]: I0313 10:36:01.326860 7508 flags.go:64] FLAG: --anonymous-auth="true"
Mar 13 10:36:01.326940 master-0 kubenswrapper[7508]: I0313 10:36:01.326927 7508 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 13 10:36:01.327001 master-0 kubenswrapper[7508]: I0313 10:36:01.326992 7508 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 13 10:36:01.327063 master-0 kubenswrapper[7508]: I0313 10:36:01.327051 7508 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 13 10:36:01.327130 master-0 kubenswrapper[7508]: I0313 10:36:01.327118 7508 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 13 10:36:01.327196 master-0 kubenswrapper[7508]: I0313 10:36:01.327186 7508 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 13 10:36:01.327256 master-0 kubenswrapper[7508]: I0313 10:36:01.327247 7508 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 13 10:36:01.327316 master-0 kubenswrapper[7508]: I0313 10:36:01.327306 7508 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 13 10:36:01.327369 master-0 kubenswrapper[7508]: I0313 10:36:01.327360 7508 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 13 10:36:01.327460 master-0 kubenswrapper[7508]: I0313 10:36:01.327450 7508 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 13 10:36:01.327511 master-0 kubenswrapper[7508]: I0313 10:36:01.327502 7508 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 13 10:36:01.327557 master-0 kubenswrapper[7508]: I0313 10:36:01.327549 7508 flags.go:64] FLAG: --cgroup-root=""
Mar 13 10:36:01.327602 master-0 kubenswrapper[7508]: I0313 10:36:01.327594 7508 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 13 10:36:01.327653 master-0 kubenswrapper[7508]: I0313 10:36:01.327645 7508 flags.go:64] FLAG: --client-ca-file=""
Mar 13 10:36:01.327711 master-0 kubenswrapper[7508]: I0313 10:36:01.327703 7508 flags.go:64] FLAG: --cloud-config=""
Mar 13 10:36:01.327772 master-0 kubenswrapper[7508]: I0313 10:36:01.327762 7508 flags.go:64] FLAG: --cloud-provider=""
Mar 13 10:36:01.327833 master-0 kubenswrapper[7508]: I0313 10:36:01.327821 7508 flags.go:64] FLAG: --cluster-dns="[]"
Mar 13 10:36:01.327893 master-0 kubenswrapper[7508]: I0313 10:36:01.327884 7508 flags.go:64] FLAG: --cluster-domain=""
Mar 13 10:36:01.327981 master-0 kubenswrapper[7508]: I0313 10:36:01.327971 7508 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 13 10:36:01.328041 master-0 kubenswrapper[7508]: I0313 10:36:01.328032 7508 flags.go:64] FLAG: --config-dir=""
Mar 13 10:36:01.328112 master-0 kubenswrapper[7508]: I0313 10:36:01.328101 7508 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 13 10:36:01.328189 master-0 kubenswrapper[7508]: I0313 10:36:01.328177 7508 flags.go:64] FLAG: --container-log-max-files="5"
Mar 13 10:36:01.328252 master-0 kubenswrapper[7508]: I0313 10:36:01.328242 7508 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 13 10:36:01.328311 master-0 kubenswrapper[7508]: I0313 10:36:01.328302 7508 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 13 10:36:01.328370 master-0 kubenswrapper[7508]: I0313 10:36:01.328360 7508 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 13 10:36:01.328422 master-0 kubenswrapper[7508]: I0313 10:36:01.328413 7508 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 13 10:36:01.328479 master-0 kubenswrapper[7508]: I0313 10:36:01.328470 7508 flags.go:64] FLAG: --contention-profiling="false"
Mar 13 10:36:01.328565 master-0 kubenswrapper[7508]: I0313 10:36:01.328556 7508 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 13 10:36:01.328616 master-0 kubenswrapper[7508]: I0313 10:36:01.328607 7508 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 13 10:36:01.328679 master-0 kubenswrapper[7508]: I0313 10:36:01.328670 7508 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 13 10:36:01.328743 master-0 kubenswrapper[7508]: I0313 10:36:01.328731 7508 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 13 10:36:01.328802 master-0 kubenswrapper[7508]: I0313 10:36:01.328793 7508 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 13 10:36:01.328862 master-0 kubenswrapper[7508]: I0313 10:36:01.328853 7508 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 13 10:36:01.328920 master-0 kubenswrapper[7508]: I0313 10:36:01.328911 7508 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 13 10:36:01.328978 master-0 kubenswrapper[7508]: I0313 10:36:01.328969 7508 flags.go:64] FLAG: --enable-load-reader="false"
Mar 13 10:36:01.329035 master-0 kubenswrapper[7508]: I0313 10:36:01.329026 7508 flags.go:64] FLAG: --enable-server="true"
Mar 13 10:36:01.329109 master-0 kubenswrapper[7508]: I0313 10:36:01.329084 7508 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 13 10:36:01.329182 master-0 kubenswrapper[7508]: I0313 10:36:01.329172 7508 flags.go:64] FLAG: --event-burst="100"
Mar 13 10:36:01.329306 master-0 kubenswrapper[7508]: I0313 10:36:01.329225 7508 flags.go:64] FLAG: --event-qps="50"
Mar 13 10:36:01.329394 master-0 kubenswrapper[7508]: I0313 10:36:01.329363 7508 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 13 10:36:01.329471 master-0 kubenswrapper[7508]: I0313 10:36:01.329443 7508 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 13 10:36:01.329556 master-0 kubenswrapper[7508]: I0313 10:36:01.329544 7508 flags.go:64] FLAG: --eviction-hard=""
Mar 13 10:36:01.329636 master-0 kubenswrapper[7508]: I0313 10:36:01.329626 7508 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 13 10:36:01.329715 master-0 kubenswrapper[7508]: I0313 10:36:01.329705 7508 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 13 10:36:01.329796 master-0 kubenswrapper[7508]: I0313 10:36:01.329786 7508 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 13 10:36:01.329903 master-0 kubenswrapper[7508]: I0313 10:36:01.329893 7508 flags.go:64] FLAG: --eviction-soft=""
Mar 13 10:36:01.329986 master-0 kubenswrapper[7508]: I0313 10:36:01.329977 7508 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 13 10:36:01.330065 master-0 kubenswrapper[7508]: I0313 10:36:01.330056 7508 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 13 10:36:01.330155 master-0 kubenswrapper[7508]: I0313 10:36:01.330145 7508 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 13 10:36:01.330237 master-0 kubenswrapper[7508]: I0313 10:36:01.330228 7508 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 13 10:36:01.330309 master-0 kubenswrapper[7508]: I0313 10:36:01.330300 7508 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 13 10:36:01.330377 master-0 kubenswrapper[7508]: I0313 10:36:01.330368 7508 flags.go:64] FLAG: --fail-swap-on="true"
Mar 13 10:36:01.330473 master-0 kubenswrapper[7508]: I0313 10:36:01.330429 7508 flags.go:64] FLAG: --feature-gates=""
Mar 13 10:36:01.330539 master-0 kubenswrapper[7508]: I0313 10:36:01.330528 7508 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 13 10:36:01.330588 master-0 kubenswrapper[7508]: I0313 10:36:01.330579 7508 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 13 10:36:01.330640 master-0 kubenswrapper[7508]: I0313 10:36:01.330631 7508 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 13 10:36:01.330700 master-0 kubenswrapper[7508]: I0313 10:36:01.330690 7508 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 13 10:36:01.330756 master-0 kubenswrapper[7508]: I0313 10:36:01.330746 7508 flags.go:64] FLAG: --healthz-port="10248"
Mar 13 10:36:01.330814 master-0 kubenswrapper[7508]: I0313 10:36:01.330805 7508 flags.go:64] FLAG: --help="false"
Mar 13 10:36:01.330867 master-0 kubenswrapper[7508]: I0313 10:36:01.330858 7508 flags.go:64] FLAG: --hostname-override=""
Mar 13 10:36:01.330925 master-0 kubenswrapper[7508]: I0313 10:36:01.330916 7508 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 13 10:36:01.331037 master-0 kubenswrapper[7508]: I0313 10:36:01.331027 7508 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 13 10:36:01.331119 master-0 kubenswrapper[7508]: I0313 10:36:01.331088 7508 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 13 10:36:01.331182 master-0 kubenswrapper[7508]: I0313 10:36:01.331172 7508 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 13 10:36:01.331242 master-0 kubenswrapper[7508]: I0313 10:36:01.331233 7508 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 13 10:36:01.331297 master-0 kubenswrapper[7508]: I0313 10:36:01.331288 7508 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 13 10:36:01.331355 master-0 kubenswrapper[7508]: I0313 10:36:01.331346 7508 flags.go:64] FLAG: --image-service-endpoint=""
Mar 13 10:36:01.331416 master-0 kubenswrapper[7508]: I0313 10:36:01.331407 7508 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 13 10:36:01.331494 master-0 kubenswrapper[7508]: I0313 10:36:01.331485 7508 flags.go:64] FLAG: --kube-api-burst="100"
Mar 13 10:36:01.331542 master-0 kubenswrapper[7508]: I0313 10:36:01.331533 7508 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 13 10:36:01.331664 master-0 kubenswrapper[7508]: I0313 10:36:01.331654 7508 flags.go:64] FLAG: --kube-api-qps="50"
Mar 13 10:36:01.331725 master-0 kubenswrapper[7508]: I0313 10:36:01.331715 7508 flags.go:64] FLAG: --kube-reserved=""
Mar 13 10:36:01.331773 master-0 kubenswrapper[7508]: I0313 10:36:01.331765 7508 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 13 10:36:01.331830 master-0 kubenswrapper[7508]: I0313 10:36:01.331821 7508 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 13 10:36:01.331879 master-0 kubenswrapper[7508]: I0313 10:36:01.331870 7508 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 13 10:36:01.331937 master-0 kubenswrapper[7508]: I0313 10:36:01.331928 7508 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 13 10:36:01.332018 master-0 kubenswrapper[7508]: I0313 10:36:01.332009 7508 flags.go:64] FLAG: --lock-file=""
Mar 13 10:36:01.332083 master-0 kubenswrapper[7508]: I0313 10:36:01.332073 7508 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 13 10:36:01.332152 master-0 kubenswrapper[7508]: I0313 10:36:01.332143 7508 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 13 10:36:01.332219 master-0 kubenswrapper[7508]: I0313 10:36:01.332205 7508 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 13 10:36:01.332270 master-0 kubenswrapper[7508]: I0313 10:36:01.332261 7508 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 13 10:36:01.332347 master-0 kubenswrapper[7508]: I0313 10:36:01.332337 7508 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 13 10:36:01.332407 master-0 kubenswrapper[7508]: I0313 10:36:01.332398 7508 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 13 10:36:01.332465 master-0 kubenswrapper[7508]: I0313 10:36:01.332456 7508 flags.go:64] FLAG: --logging-format="text"
Mar 13 10:36:01.332563 master-0 kubenswrapper[7508]: I0313 10:36:01.332553 7508 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 13 10:36:01.332626 master-0 kubenswrapper[7508]: I0313 10:36:01.332617 7508 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 13 10:36:01.332685 master-0 kubenswrapper[7508]: I0313 10:36:01.332676 7508 flags.go:64] FLAG: --manifest-url=""
Mar 13 10:36:01.332742 master-0 kubenswrapper[7508]: I0313 10:36:01.332729 7508 flags.go:64] FLAG: --manifest-url-header=""
Mar 13 10:36:01.332801 master-0 kubenswrapper[7508]: I0313 10:36:01.332792 7508 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 13 10:36:01.332862 master-0 kubenswrapper[7508]: I0313 10:36:01.332852 7508 flags.go:64] FLAG: --max-open-files="1000000"
Mar 13 10:36:01.332911 master-0 kubenswrapper[7508]: I0313 10:36:01.332902 7508 flags.go:64] FLAG: --max-pods="110"
Mar 13 10:36:01.332963 master-0 kubenswrapper[7508]: I0313 10:36:01.332954 7508 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 13 10:36:01.333053 master-0 kubenswrapper[7508]: I0313 10:36:01.333044 7508 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 13 10:36:01.333127 master-0 kubenswrapper[7508]: I0313 10:36:01.333117 7508 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 13 10:36:01.333193 master-0 kubenswrapper[7508]: I0313 10:36:01.333183 7508 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 13 10:36:01.333245 master-0 kubenswrapper[7508]: I0313 10:36:01.333236 7508 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 13 10:36:01.333312 master-0 kubenswrapper[7508]: I0313 10:36:01.333302 7508 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 13 10:36:01.333380 master-0 kubenswrapper[7508]: I0313 10:36:01.333363 7508 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 13 10:36:01.333440 master-0 kubenswrapper[7508]: I0313 10:36:01.333431 7508 flags.go:64] FLAG: --node-status-max-images="50"
Mar 13 10:36:01.333532 master-0 kubenswrapper[7508]: I0313 10:36:01.333522 7508 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 13 10:36:01.333611 master-0 kubenswrapper[7508]: I0313 10:36:01.333600 7508 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 13 10:36:01.333678 master-0 kubenswrapper[7508]: I0313 10:36:01.333668 7508 flags.go:64] FLAG: --pod-cidr=""
Mar 13 10:36:01.333743 master-0 kubenswrapper[7508]: I0313 10:36:01.333729 7508 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 13 10:36:01.333804 master-0 kubenswrapper[7508]: I0313 10:36:01.333795 7508 flags.go:64] FLAG: --pod-manifest-path=""
Mar 13 10:36:01.333853 master-0 kubenswrapper[7508]: I0313 10:36:01.333844 7508 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 13 10:36:01.333907 master-0 kubenswrapper[7508]: I0313 10:36:01.333898 7508 flags.go:64] FLAG: --pods-per-core="0"
Mar 13 10:36:01.333966 master-0 kubenswrapper[7508]: I0313 10:36:01.333957 7508 flags.go:64] FLAG: --port="10250"
Mar 13 10:36:01.334015 master-0 kubenswrapper[7508]: I0313 10:36:01.334006 7508 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 13 10:36:01.334072 master-0 kubenswrapper[7508]: I0313 10:36:01.334063 7508 flags.go:64] FLAG: --provider-id=""
Mar 13 10:36:01.334164 master-0 kubenswrapper[7508]: I0313 10:36:01.334155 7508 flags.go:64] FLAG: --qos-reserved=""
Mar 13 10:36:01.334230 master-0 kubenswrapper[7508]: I0313 10:36:01.334220 7508 flags.go:64] FLAG: --read-only-port="10255"
Mar 13 10:36:01.334282 master-0 kubenswrapper[7508]: I0313 10:36:01.334274 7508 flags.go:64] FLAG: --register-node="true"
Mar 13 10:36:01.334341 master-0 kubenswrapper[7508]: I0313 10:36:01.334332 7508 flags.go:64] FLAG: --register-schedulable="true"
Mar 13 10:36:01.334405 master-0 kubenswrapper[7508]: I0313 10:36:01.334392 7508 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 13 10:36:01.334465 master-0 kubenswrapper[7508]: I0313 10:36:01.334456 7508 flags.go:64] FLAG: --registry-burst="10"
Mar 13 10:36:01.334523 master-0 kubenswrapper[7508]: I0313 10:36:01.334515 7508 flags.go:64] FLAG: --registry-qps="5"
Mar 13 10:36:01.334592 master-0 kubenswrapper[7508]: I0313 10:36:01.334583 7508 flags.go:64] FLAG: --reserved-cpus=""
Mar 13 10:36:01.334661 master-0 kubenswrapper[7508]: I0313 10:36:01.334650 7508 flags.go:64] FLAG: --reserved-memory=""
Mar 13 10:36:01.334749 master-0 kubenswrapper[7508]: I0313 10:36:01.334739 7508 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 13 10:36:01.334805 master-0 kubenswrapper[7508]: I0313 10:36:01.334796 7508 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 13 10:36:01.334868 master-0 kubenswrapper[7508]: I0313 10:36:01.334859 7508 flags.go:64] FLAG: --rotate-certificates="false"
Mar 13 10:36:01.334917 master-0 kubenswrapper[7508]: I0313 10:36:01.334909 7508 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 13 10:36:01.334975 master-0 kubenswrapper[7508]: I0313 10:36:01.334966 7508 flags.go:64] FLAG: --runonce="false"
Mar 13 10:36:01.335032 master-0 kubenswrapper[7508]: I0313 10:36:01.335024 7508 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 13 10:36:01.335111 master-0 kubenswrapper[7508]: I0313 10:36:01.335081 7508 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 13 10:36:01.335177 master-0 kubenswrapper[7508]: I0313 10:36:01.335168 7508 flags.go:64] FLAG: --seccomp-default="false"
Mar 13 10:36:01.335237 master-0 kubenswrapper[7508]: I0313 10:36:01.335228 7508 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 13 10:36:01.335299 master-0 kubenswrapper[7508]: I0313 10:36:01.335289 7508 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 13 10:36:01.335387 master-0 kubenswrapper[7508]: I0313 10:36:01.335377 7508 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 13 10:36:01.335441 master-0 kubenswrapper[7508]: I0313 10:36:01.335432 7508 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 13 10:36:01.335488 master-0 kubenswrapper[7508]: I0313 10:36:01.335480 7508 flags.go:64] FLAG: --storage-driver-password="root"
Mar 13 10:36:01.335547 master-0 kubenswrapper[7508]: I0313 10:36:01.335538 7508 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 13 10:36:01.335605 master-0 kubenswrapper[7508]: I0313 10:36:01.335596 7508 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 13 10:36:01.335666 master-0 kubenswrapper[7508]: I0313 10:36:01.335657 7508 flags.go:64] FLAG: --storage-driver-user="root"
Mar 13 10:36:01.335719 master-0 kubenswrapper[7508]: I0313 10:36:01.335710 7508 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 13 10:36:01.335778 master-0 kubenswrapper[7508]: I0313 10:36:01.335769 7508 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 13 10:36:01.335836 master-0 kubenswrapper[7508]: I0313 10:36:01.335827 7508 flags.go:64] FLAG: --system-cgroups=""
Mar 13 10:36:01.335897 master-0 kubenswrapper[7508]: I0313 10:36:01.335885 7508 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 13 10:36:01.335979 master-0 kubenswrapper[7508]: I0313 10:36:01.335970 7508 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 10:36:01.336026 master-0 kubenswrapper[7508]: I0313 10:36:01.336018 7508 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 10:36:01.336087 master-0 kubenswrapper[7508]: I0313 10:36:01.336076 7508 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 10:36:01.336171 master-0 kubenswrapper[7508]: I0313 10:36:01.336160 7508 flags.go:64] FLAG: --tls-min-version=""
Mar 13 10:36:01.336248 master-0 kubenswrapper[7508]: I0313 10:36:01.336239 7508 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 10:36:01.336294 master-0 kubenswrapper[7508]: I0313 10:36:01.336286 7508 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 10:36:01.336346 master-0 kubenswrapper[7508]: I0313 10:36:01.336337 7508 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 10:36:01.336405 master-0 kubenswrapper[7508]: I0313 10:36:01.336396 7508 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 10:36:01.336470 master-0 kubenswrapper[7508]: I0313 10:36:01.336454 7508 flags.go:64] FLAG: --v="2"
Mar 13 10:36:01.336533 master-0 kubenswrapper[7508]: I0313 10:36:01.336522 7508 flags.go:64] FLAG: --version="false"
Mar 13 10:36:01.336622 master-0 kubenswrapper[7508]: I0313 10:36:01.336583 7508 flags.go:64] FLAG: --vmodule=""
Mar 13 10:36:01.336688 master-0 kubenswrapper[7508]: I0313 10:36:01.336678 7508 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 10:36:01.336748 master-0 kubenswrapper[7508]: I0313 10:36:01.336739 7508 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 10:36:01.336913 master-0 kubenswrapper[7508]: W0313 10:36:01.336903 7508 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:36:01.336971 master-0 kubenswrapper[7508]: W0313 10:36:01.336963 7508 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:36:01.337026 master-0 kubenswrapper[7508]: W0313 10:36:01.337018 7508 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:36:01.337083 master-0 kubenswrapper[7508]: W0313 10:36:01.337075 7508 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:36:01.337197 master-0 kubenswrapper[7508]: W0313 10:36:01.337187 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:36:01.337259 master-0 kubenswrapper[7508]: W0313 10:36:01.337250 7508 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:36:01.337360 master-0 kubenswrapper[7508]: W0313 10:36:01.337350 7508 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:36:01.337420 master-0 kubenswrapper[7508]: W0313 10:36:01.337412 7508 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:36:01.337477 master-0 kubenswrapper[7508]: W0313 10:36:01.337469 7508 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:36:01.337527 master-0 kubenswrapper[7508]: W0313 10:36:01.337519 7508 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:36:01.337583 master-0 kubenswrapper[7508]: W0313 10:36:01.337575 7508 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:36:01.337635 master-0 kubenswrapper[7508]: W0313 10:36:01.337627 7508 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337683 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337690 7508 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337694 7508 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337698 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337702 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337706 7508 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337709 7508 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337715 7508 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337720 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337724 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337729 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337733 7508 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337737 7508 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337741 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337745 7508 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337749 7508 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337752 7508 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337756 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:36:01.337862 master-0 kubenswrapper[7508]: W0313 10:36:01.337759 7508 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337763 7508 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337767 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337770 7508 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337774 7508 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337777 7508 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337781 7508 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337787 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337791 7508 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337794 7508 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337798 7508 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337803 7508 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337808 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337812 7508 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337815 7508 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337819 7508 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337822 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337826 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:36:01.338417 master-0 kubenswrapper[7508]: W0313 10:36:01.337830 7508 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339298 7508 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339311 7508 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339315 7508 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339319 7508 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339323 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339327 7508 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:36:01.339587 master-0
kubenswrapper[7508]: W0313 10:36:01.339333 7508 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339338 7508 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339342 7508 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339346 7508 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339351 7508 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339354 7508 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339358 7508 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339362 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339365 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339369 7508 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339373 7508 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339376 7508 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 10:36:01.339587 master-0 kubenswrapper[7508]: W0313 10:36:01.339380 7508 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 
10:36:01.340141 master-0 kubenswrapper[7508]: W0313 10:36:01.339383 7508 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 10:36:01.340141 master-0 kubenswrapper[7508]: W0313 10:36:01.339387 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 10:36:01.340141 master-0 kubenswrapper[7508]: W0313 10:36:01.339391 7508 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 10:36:01.340141 master-0 kubenswrapper[7508]: W0313 10:36:01.339396 7508 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 10:36:01.340141 master-0 kubenswrapper[7508]: I0313 10:36:01.339418 7508 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 13 10:36:01.350487 master-0 kubenswrapper[7508]: I0313 10:36:01.350415 7508 server.go:491] "Kubelet version" kubeletVersion="v1.31.14" Mar 13 10:36:01.350627 master-0 kubenswrapper[7508]: I0313 10:36:01.350571 7508 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 13 10:36:01.350737 master-0 kubenswrapper[7508]: W0313 10:36:01.350714 7508 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 10:36:01.350737 master-0 kubenswrapper[7508]: W0313 10:36:01.350728 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 10:36:01.350737 master-0 kubenswrapper[7508]: W0313 10:36:01.350732 7508 
feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 10:36:01.350737 master-0 kubenswrapper[7508]: W0313 10:36:01.350737 7508 feature_gate.go:330] unrecognized feature gate: Example Mar 13 10:36:01.350737 master-0 kubenswrapper[7508]: W0313 10:36:01.350742 7508 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350746 7508 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350751 7508 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350782 7508 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350792 7508 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350797 7508 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350801 7508 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350805 7508 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350810 7508 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350813 7508 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350817 7508 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350822 7508 
feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350826 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350851 7508 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350857 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350860 7508 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350865 7508 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350870 7508 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350874 7508 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 10:36:01.350892 master-0 kubenswrapper[7508]: W0313 10:36:01.350879 7508 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350883 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350887 7508 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350890 7508 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350894 7508 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 
10:36:01.350898 7508 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350949 7508 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350956 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350960 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350964 7508 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350968 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350976 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350980 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350984 7508 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350987 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.350991 7508 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.351124 7508 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.351129 7508 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.351134 7508 feature_gate.go:330] unrecognized 
feature gate: InsightsRuntimeExtractor Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.351137 7508 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 10:36:01.351528 master-0 kubenswrapper[7508]: W0313 10:36:01.351141 7508 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351145 7508 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351149 7508 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351153 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351157 7508 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351160 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351164 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351167 7508 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351171 7508 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351260 7508 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351273 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351280 7508 feature_gate.go:330] unrecognized 
feature gate: AlibabaPlatform Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351284 7508 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351287 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351296 7508 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351300 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351305 7508 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351310 7508 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351314 7508 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 10:36:01.352115 master-0 kubenswrapper[7508]: W0313 10:36:01.351322 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351325 7508 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351329 7508 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351333 7508 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351338 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351341 7508 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB 
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351345 7508 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351349 7508 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351454 7508 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351463 7508 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: I0313 10:36:01.351471 7508 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351706 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351732 7508 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351737 7508 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351742 7508 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:36:01.352750 master-0 kubenswrapper[7508]: W0313 10:36:01.351746 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351752 7508 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351758 7508 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351762 7508 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351767 7508 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351771 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351790 7508 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351795 7508 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351800 7508 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351827 7508 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351836 7508 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351840 7508 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351864 7508 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351868 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351876 7508 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351880 7508 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351884 7508 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351888 7508 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351891 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:36:01.353214 master-0 kubenswrapper[7508]: W0313 10:36:01.351895 7508 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351899 7508 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351917 7508 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351921 7508 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351924 7508 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351928 7508 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351931 7508 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351949 7508 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351953 7508 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351957 7508 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351961 7508 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351965 7508 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351969 7508 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351973 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351993 7508 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.351998 7508 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.352003 7508 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.352007 7508 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.352010 7508 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:36:01.353761 master-0 kubenswrapper[7508]: W0313 10:36:01.352014 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352017 7508 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352033 7508 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352038 7508 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352042 7508 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352045 7508 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352049 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352054 7508 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352057 7508 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352061 7508 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352065 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352068 7508 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352072 7508 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352089 7508 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352114 7508 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352119 7508 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352123 7508 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352127 7508 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352131 7508 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352135 7508 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:36:01.354367 master-0 kubenswrapper[7508]: W0313 10:36:01.352139 7508 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352144 7508 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352149 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352153 7508 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352157 7508 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352160 7508 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352164 7508 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352169 7508 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352188 7508 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: W0313 10:36:01.352267 7508 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: I0313 10:36:01.352284 7508 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:36:01.354934 master-0 kubenswrapper[7508]: I0313 10:36:01.352723 7508 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 13 10:36:01.356424 master-0 kubenswrapper[7508]: I0313 10:36:01.356395 7508 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 13 10:36:01.356908 master-0 kubenswrapper[7508]: I0313 10:36:01.356804 7508 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 13 10:36:01.357538 master-0 kubenswrapper[7508]: I0313 10:36:01.357513 7508 server.go:997] "Starting client certificate rotation"
Mar 13 10:36:01.357626 master-0 kubenswrapper[7508]: I0313 10:36:01.357605 7508 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 13 10:36:01.357901 master-0 kubenswrapper[7508]: I0313 10:36:01.357801 7508 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 07:45:51.390909692 +0000 UTC
Mar 13 10:36:01.357952 master-0 kubenswrapper[7508]: I0313 10:36:01.357899 7508 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 21h9m50.033013741s for next certificate rotation
Mar 13 10:36:01.359624 master-0 kubenswrapper[7508]: I0313 10:36:01.359597 7508 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 10:36:01.363700 master-0 kubenswrapper[7508]: I0313 10:36:01.363667 7508 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 10:36:01.372762 master-0 kubenswrapper[7508]: I0313 10:36:01.372727 7508 log.go:25] "Validated CRI v1 runtime API"
Mar 13 10:36:01.376573 master-0 kubenswrapper[7508]: I0313 10:36:01.376553 7508 log.go:25] "Validated CRI v1 image API"
Mar 13 10:36:01.378561 master-0 kubenswrapper[7508]: I0313 10:36:01.378530 7508 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 13 10:36:01.389654 master-0
kubenswrapper[7508]: I0313 10:36:01.389548 7508 fs.go:135] Filesystem UUIDs: map[58e57e2d-ae5b-4324-bfe8-6d8d8bd04e58:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 13 10:36:01.389990 master-0 kubenswrapper[7508]: I0313 10:36:01.389605 7508 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9/userdata/shm major:0 minor:253 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350/userdata/shm major:0 minor:245 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0/userdata/shm major:0 minor:42 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~projected/kube-api-access-ffs2h:{mountpoint:/var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~projected/kube-api-access-ffs2h major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~projected/kube-api-access-892f7:{mountpoint:/var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~projected/kube-api-access-892f7 major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~projected/kube-api-access-kd99t:{mountpoint:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~projected/kube-api-access-kd99t major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/etcd-client major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~projected/kube-api-access-ch8qd:{mountpoint:/var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~projected/kube-api-access-ch8qd major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~projected/kube-api-access-fvmjs:{mountpoint:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~projected/kube-api-access-fvmjs major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~projected/kube-api-access-xkwfv:{mountpoint:/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~projected/kube-api-access-xkwfv major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~projected/kube-api-access-rpnm8:{mountpoint:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~projected/kube-api-access-rpnm8 major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:221 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/kube-api-access-hqf9z:{mountpoint:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/kube-api-access-hqf9z major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~projected/kube-api-access-frmjp:{mountpoint:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~projected/kube-api-access-frmjp major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~projected/kube-api-access-d72bw:{mountpoint:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~projected/kube-api-access-d72bw major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~projected/kube-api-access major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/58685de6-b4ae-4229-870b-5143a6010450/volumes/kubernetes.io~projected/kube-api-access-kn5nv:{mountpoint:/var/lib/kubelet/pods/58685de6-b4ae-4229-870b-5143a6010450/volumes/kubernetes.io~projected/kube-api-access-kn5nv major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~projected/kube-api-access-twcrj:{mountpoint:/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~projected/kube-api-access-twcrj major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~projected/kube-api-access-wlmhs:{mountpoint:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~projected/kube-api-access-wlmhs major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~projected/kube-api-access major:0 minor:244 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~projected/kube-api-access-f6fm9:{mountpoint:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~projected/kube-api-access-f6fm9 major:0 minor:103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~projected/kube-api-access-74lr7:{mountpoint:/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~projected/kube-api-access-74lr7 major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~projected/kube-api-access-bzxzq:{mountpoint:/var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~projected/kube-api-access-bzxzq major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~projected/kube-api-access-b9l88:{mountpoint:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~projected/kube-api-access-b9l88 major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b04498f0-5a3f-4461-aecb-50304662d854/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b04498f0-5a3f-4461-aecb-50304662d854/volumes/kubernetes.io~projected/kube-api-access major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~projected/kube-api-access-hwfd8:{mountpoint:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~projected/kube-api-access-hwfd8 major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cc66541c-6410-4824-b173-53747069429e/volumes/kubernetes.io~projected/kube-api-access-5p4cf:{mountpoint:/var/lib/kubelet/pods/cc66541c-6410-4824-b173-53747069429e/volumes/kubernetes.io~projected/kube-api-access-5p4cf major:0 minor:115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/kube-api-access-6nfl8:{mountpoint:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/kube-api-access-6nfl8 major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~projected/kube-api-access-s2kqq:{mountpoint:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~projected/kube-api-access-s2kqq major:0 minor:233 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~projected/kube-api-access-b9768:{mountpoint:/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~projected/kube-api-access-b9768 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e87ca16c-25de-4fea-b900-2960f4a5f95e/volumes/kubernetes.io~projected/kube-api-access-wrq5t:{mountpoint:/var/lib/kubelet/pods/e87ca16c-25de-4fea-b900-2960f4a5f95e/volumes/kubernetes.io~projected/kube-api-access-wrq5t major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc/volumes/kubernetes.io~projected/kube-api-access-zjxp2:{mountpoint:/var/lib/kubelet/pods/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc/volumes/kubernetes.io~projected/kube-api-access-zjxp2 major:0 minor:104 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~projected/kube-api-access-9c92k:{mountpoint:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~projected/kube-api-access-9c92k major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~secret/serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~projected/kube-api-access-tzdf2:{mountpoint:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~projected/kube-api-access-tzdf2 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~projected/kube-api-access-zscfc:{mountpoint:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~projected/kube-api-access-zscfc major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} overlay_0-107:{mountpoint:/var/lib/containers/storage/overlay/51ee4ddf1cec0b7db149f561d502a0ed68d3f058aff373243fa18c6e50637da7/merged major:0 minor:107 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/a2b25121abd050ee1077647cd54f1731a5d154c9e8827bc213a663b0943aba5f/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-113:{mountpoint:/var/lib/containers/storage/overlay/a8fcb64b148de985b6fcfa23ea0772cc70ab98c1598002459bc9c2e3bf6a41e0/merged major:0 minor:113 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/d454cda786f8b4e7e94db42bc98005f38a20d8041248a399f447b9403208c68d/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/5d1950ddba4d37696d6e2782d7bfa3f829e8c0a5bd53b7dff381b344e4009ec7/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/9177e38a4c08bbb28fbc771f9305d06e7095d26e22fcc940217219997aaafba2/merged major:0 minor:134 fsType:overlay blockSize:0} 
overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/13d6ea04e1be1e67e086be3e3dd0033fd79232507de8553b7441b55ebf1985ee/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/5da52f1bda4e856534bf0ff70330e601aebebcc75c60b5513f5899f743a6feb9/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-144:{mountpoint:/var/lib/containers/storage/overlay/6d5b96f8d256abc5e49d0b929a834320335035114d9c04dfe2c43b427cfc3aaf/merged major:0 minor:144 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/81d7e40c300581ffe0b218130f8fff3774f3e8e736ede1c1fa2065161c9d32bf/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/737e9c58faee00c4c2872b97e8b290fc78c01720c0546498e61fa21b975b3156/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-159:{mountpoint:/var/lib/containers/storage/overlay/9868f778aac90102362a64b2880fbe5a146631be470892b8fac2a1d11fc7cd59/merged major:0 minor:159 fsType:overlay blockSize:0} overlay_0-161:{mountpoint:/var/lib/containers/storage/overlay/775bc6e4bf4a91cd67e389f04bcc2ade69486049c8e0096fafe40d235e2468e4/merged major:0 minor:161 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/b384c42209d804b06b6915392cc8a22c18073f1a4ceffe284fa3e224021c6be2/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-171:{mountpoint:/var/lib/containers/storage/overlay/6eaa2cbee3e1d380f878b1ac04b4c710667081431b9d5c003e02b8fd12f48952/merged major:0 minor:171 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/43dc212ad089f6ff2917c951f147af293821268e11330d15db5a91fb9343271e/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/5e5885bfaca2d8c1f291e909673bacbb8cc59606cee3b2e9ed301be872fd0fb4/merged major:0 minor:184 fsType:overlay blockSize:0} 
overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/6d36a3eb9028da7f760341e60345bbf4023ec073aceb9dd458926d24593c6f4e/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/5d8122c338f3daedb4bb19657253f8795a9753bbf6e3019bf7fec9d3fcc8a2f4/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/b950239c02165cf8a57c5cc1570c7f4b09d60d2cf003641ca7dea33e126ff115/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/9d563e4afb673a69a96b12b27d43ad8a0bb745b536e0716838cab0d12fcd9c59/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/8de371efd96578ccc939d25d4b30c41156f73a21bd0c65812508317c599aae3d/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/f899e18fa8dbf3319e0f2fa72649691ff5ab04c9a826218dc89a8718b8b3b1cb/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/ab26f40fd99753f5bd3fafeb100b829d22e99dd20f3d1152feb0a088c628c3ac/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/3c13b3b5f7fe1c6a8072b8566f3bb234441deefc87e04a62055d0a816d37f30d/merged major:0 minor:281 fsType:overlay blockSize:0} overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/259115946e4112665a27b644eb3b91bee2b22821ed4347acdf601e6960a0224f/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/0fb4682fb6a43f9080f194d26b5048f3223f37b986c910a3620d951d23c75d68/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/653aa83583d1b46bc3b7b604338b291c9dce41984a7e631867276167378628ef/merged major:0 minor:287 fsType:overlay blockSize:0} 
overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/3e085f9531fbb969a97a96d7bfaa906838dfd5273709b75371d1e0c43db1b2e5/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/6054552f2c76bbf24cd482c1b3f5dbe5856cfbebba509a62813070a37faccbbc/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/77ad5fbff258647a532dc461679dc56b9f856c17aecd286d3b6bdce623541d6b/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/c71d16d546cb0676dbd656400e4e52aca6a9e7fe8800ec2c312f5dcffa93fd43/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/4417774e3b57d233551490f3238b8b38fada92b33064129303e9a293bb033c05/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-299:{mountpoint:/var/lib/containers/storage/overlay/ba49858213f3d34b64eab9646ecaf3299dab30c4538bcc9f9c03d138ed656bf2/merged major:0 minor:299 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/eb828c4a888ea55c86f7686c5e47943027d16f2f7a09d4a5910a4a16b93beb52/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-41:{mountpoint:/var/lib/containers/storage/overlay/0c2aa20999f4d781eca6400565a8bda800ae591a1d0c028161bc68f914999c35/merged major:0 minor:41 fsType:overlay blockSize:0} overlay_0-44:{mountpoint:/var/lib/containers/storage/overlay/2a88d10e550342b2716cfe3bcbe9dad3557281869537887fcf161c521c5f73c1/merged major:0 minor:44 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/340e762acad6d9554da3689c3ffe108a164ba1d63d01fc8066cf2e2cc887f3f1/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/24dca11bfc7ea37f221888024b9a1eb036ef0079177c980e95515f4c4fd8195b/merged major:0 minor:52 fsType:overlay blockSize:0} 
overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/37f654fdaa2b47a3debf99648019d42f4a24d52e85cbd5a0ff44eb9b22b3cf09/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/01bc2fe9632e5a08a52a2de09638c6788928d404549a020ac4cd4a493eb03249/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/1d2332ebd7f7d504f72f22acb72248200ba7a5095a08d8566f7eeb8e43d67fe1/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/80e4407c22df390bcefe378a55fbc96f63c9d92d0d8edbada6a4121431135a77/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/3757136ffedaa89354ba98a725348a61a9e74b5d15e2a2257bc243912ac71fd5/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/88f26bd056a33b1091c15b0340cf64c6a429e29a6377ace03a1af4957ff3b366/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/4f30e221c8651c19909c94969aa7a143f37683db49bd6788ba7a86e900209f05/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-78:{mountpoint:/var/lib/containers/storage/overlay/b255385eb8d10c6a7b6ad55eac429c21b9b8bb8f0943f184a7c8a2812be0ebb3/merged major:0 minor:78 fsType:overlay blockSize:0} overlay_0-89:{mountpoint:/var/lib/containers/storage/overlay/94c12b29db9922112d9383468bc07184287e650014119547feb6accb63871e79/merged major:0 minor:89 fsType:overlay blockSize:0}] Mar 13 10:36:01.428327 master-0 kubenswrapper[7508]: I0313 10:36:01.426941 7508 manager.go:217] Machine: {Timestamp:2026-03-13 10:36:01.425136125 +0000 UTC m=+0.167961312 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} 
HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:3a49dbefec214e87acca6e8120215b7b SystemUUID:3a49dbef-ec21-4e87-acca-6e8120215b7b BootID:794a19f0-76ba-45e8-ae39-0211fb872ab6 Filesystems:[{Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~projected/kube-api-access-kd99t DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~projected/kube-api-access-rpnm8 DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-113 DeviceMajor:0 DeviceMinor:113 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19/userdata/shm DeviceMajor:0 
DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~projected/kube-api-access-ch8qd DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-44 DeviceMajor:0 DeviceMinor:44 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~projected/kube-api-access-hwfd8 DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-299 DeviceMajor:0 DeviceMinor:299 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~projected/kube-api-access-892f7 DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~projected/kube-api-access-9c92k DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b04498f0-5a3f-4461-aecb-50304662d854/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~projected/kube-api-access-tzdf2 DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-171 DeviceMajor:0 DeviceMinor:171 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~projected/kube-api-access-frmjp DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e87ca16c-25de-4fea-b900-2960f4a5f95e/volumes/kubernetes.io~projected/kube-api-access-wrq5t DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9/userdata/shm DeviceMajor:0 DeviceMinor:253 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~projected/kube-api-access-74lr7 DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-161 DeviceMajor:0 DeviceMinor:161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~projected/kube-api-access-d72bw DeviceMajor:0 DeviceMinor:254 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-78 DeviceMajor:0 DeviceMinor:78 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 
HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~projected/kube-api-access-zscfc DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~projected/kube-api-access-ffs2h DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-144 DeviceMajor:0 DeviceMinor:144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-159 DeviceMajor:0 DeviceMinor:159 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~projected/kube-api-access-wlmhs DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~projected/kube-api-access-bzxzq DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~projected/kube-api-access-twcrj DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-41 DeviceMajor:0 DeviceMinor:41 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350/userdata/shm DeviceMajor:0 DeviceMinor:245 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cc66541c-6410-4824-b173-53747069429e/volumes/kubernetes.io~projected/kube-api-access-5p4cf DeviceMajor:0 DeviceMinor:115 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:224 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~projected/kube-api-access-xkwfv DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~projected/kube-api-access-b9l88 DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/kube-api-access-6nfl8 DeviceMajor:0 DeviceMinor:229 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~projected/kube-api-access-fvmjs DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~projected/kube-api-access-f6fm9 DeviceMajor:0 DeviceMinor:103 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~projected/kube-api-access-s2kqq DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-107 DeviceMajor:0 DeviceMinor:107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~projected/kube-api-access-b9768 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc/volumes/kubernetes.io~projected/kube-api-access-zjxp2 DeviceMajor:0 DeviceMinor:104 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/kube-api-access-hqf9z DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0/userdata/shm DeviceMajor:0 DeviceMinor:42 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58685de6-b4ae-4229-870b-5143a6010450/volumes/kubernetes.io~projected/kube-api-access-kn5nv DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-89 DeviceMajor:0 DeviceMinor:89 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0731faf1ccc38c5 MacAddress:5a:eb:2c:de:95:9c Speed:10000 Mtu:8900} {Name:09bada5ccab47e8 MacAddress:8a:b6:92:88:c5:00 Speed:10000 Mtu:8900} {Name:3570848357e5506 MacAddress:66:b0:e4:67:90:d5 Speed:10000 Mtu:8900} {Name:362b488b60e500e MacAddress:ae:2a:c8:e6:3c:3d Speed:10000 Mtu:8900} {Name:7502f9cc62ba09f MacAddress:5e:9f:92:3a:40:86 Speed:10000 Mtu:8900} {Name:7d64d717a487ab9 MacAddress:56:96:e5:8b:cb:15 Speed:10000 Mtu:8900} {Name:7d8988c40bcb4c1 MacAddress:d2:c3:5c:4a:13:33 Speed:10000 Mtu:8900} {Name:83984d61bee36a6 MacAddress:6a:ee:1f:75:29:26 Speed:10000 Mtu:8900} {Name:a4d11bdc39191c7 MacAddress:d6:b1:9a:18:b0:56 Speed:10000 Mtu:8900} {Name:aba1a9619c2284c MacAddress:52:4e:6f:96:67:56 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:f6:ce:79:0c:e6:f0 Speed:0 Mtu:8900} {Name:da062cae7ba3072 MacAddress:ce:ed:e0:2d:91:4b Speed:10000 Mtu:8900} {Name:e91ae8a44c4b4ac MacAddress:8e:e7:5f:88:ad:d5 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:35:d5:aa Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:96:ef:67:d7:01:26 Speed:0 Mtu:1500}] 
Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 
Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 10:36:01.428327 master-0 kubenswrapper[7508]: I0313 10:36:01.428276 7508 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 10:36:01.429601 master-0 kubenswrapper[7508]: I0313 10:36:01.428629 7508 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 10:36:01.429601 master-0 kubenswrapper[7508]: I0313 10:36:01.429466 7508 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 10:36:01.429993 master-0 kubenswrapper[7508]: I0313 10:36:01.429886 7508 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 10:36:01.430516 master-0 kubenswrapper[7508]: I0313 10:36:01.429981 7508 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentag
e":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 10:36:01.430640 master-0 kubenswrapper[7508]: I0313 10:36:01.430590 7508 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 10:36:01.430640 master-0 kubenswrapper[7508]: I0313 10:36:01.430612 7508 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 10:36:01.430750 master-0 kubenswrapper[7508]: I0313 10:36:01.430646 7508 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 10:36:01.430750 master-0 kubenswrapper[7508]: I0313 10:36:01.430713 7508 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 10:36:01.431036 master-0 kubenswrapper[7508]: I0313 10:36:01.430999 7508 state_mem.go:36] "Initialized new in-memory state store" Mar 13 10:36:01.431277 master-0 kubenswrapper[7508]: I0313 10:36:01.431200 7508 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 10:36:01.431417 master-0 kubenswrapper[7508]: I0313 10:36:01.431386 7508 kubelet.go:418] "Attempting to sync node with API server" Mar 13 10:36:01.431417 master-0 kubenswrapper[7508]: I0313 10:36:01.431416 7508 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 10:36:01.431557 master-0 kubenswrapper[7508]: I0313 10:36:01.431496 7508 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 10:36:01.431673 master-0 kubenswrapper[7508]: I0313 10:36:01.431581 7508 kubelet.go:324] "Adding apiserver pod source" Mar 13 10:36:01.431963 master-0 
kubenswrapper[7508]: I0313 10:36:01.431872 7508 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 10:36:01.440852 master-0 kubenswrapper[7508]: I0313 10:36:01.440435 7508 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 13 10:36:01.440852 master-0 kubenswrapper[7508]: I0313 10:36:01.440641 7508 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 10:36:01.441151 master-0 kubenswrapper[7508]: I0313 10:36:01.440935 7508 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 10:36:01.441151 master-0 kubenswrapper[7508]: I0313 10:36:01.441128 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 10:36:01.441151 master-0 kubenswrapper[7508]: I0313 10:36:01.441144 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 10:36:01.441151 master-0 kubenswrapper[7508]: I0313 10:36:01.441152 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441158 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441165 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441171 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441178 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441184 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 13 10:36:01.441351 master-0 
kubenswrapper[7508]: I0313 10:36:01.441192 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441198 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441207 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441218 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 10:36:01.441351 master-0 kubenswrapper[7508]: I0313 10:36:01.441254 7508 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 10:36:01.441753 master-0 kubenswrapper[7508]: I0313 10:36:01.441640 7508 server.go:1280] "Started kubelet" Mar 13 10:36:01.443696 master-0 kubenswrapper[7508]: I0313 10:36:01.442119 7508 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 10:36:01.443696 master-0 kubenswrapper[7508]: I0313 10:36:01.442204 7508 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 10:36:01.443696 master-0 kubenswrapper[7508]: I0313 10:36:01.442331 7508 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 13 10:36:01.443696 master-0 kubenswrapper[7508]: I0313 10:36:01.442894 7508 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 10:36:01.443696 master-0 kubenswrapper[7508]: I0313 10:36:01.443269 7508 server.go:449] "Adding debug handlers to kubelet server" Mar 13 10:36:01.442747 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 13 10:36:01.445357 master-0 kubenswrapper[7508]: I0313 10:36:01.445318 7508 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 13 10:36:01.445357 master-0 kubenswrapper[7508]: I0313 10:36:01.445356 7508 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 13 10:36:01.445547 master-0 kubenswrapper[7508]: I0313 10:36:01.445367 7508 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 04:30:10.133062462 +0000 UTC
Mar 13 10:36:01.445547 master-0 kubenswrapper[7508]: I0313 10:36:01.445394 7508 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 17h54m8.687670079s for next certificate rotation
Mar 13 10:36:01.446588 master-0 kubenswrapper[7508]: I0313 10:36:01.446539 7508 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 13 10:36:01.446588 master-0 kubenswrapper[7508]: I0313 10:36:01.446556 7508 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 13 10:36:01.447740 master-0 kubenswrapper[7508]: I0313 10:36:01.446648 7508 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 13 10:36:01.447740 master-0 kubenswrapper[7508]: I0313 10:36:01.447676 7508 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 10:36:01.450517 master-0 kubenswrapper[7508]: I0313 10:36:01.448737 7508 factory.go:55] Registering systemd factory
Mar 13 10:36:01.450517 master-0 kubenswrapper[7508]: I0313 10:36:01.448820 7508 factory.go:221] Registration of the systemd container factory successfully
Mar 13 10:36:01.453071 master-0 kubenswrapper[7508]: I0313 10:36:01.452614 7508 factory.go:153] Registering CRI-O factory
Mar 13 10:36:01.453071 master-0 kubenswrapper[7508]: I0313 10:36:01.452645 7508 factory.go:221] Registration of the crio container factory successfully
Mar 13 10:36:01.453071 master-0 kubenswrapper[7508]: I0313 10:36:01.452831 7508 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 13 10:36:01.453071 master-0 kubenswrapper[7508]: I0313 10:36:01.452901 7508 factory.go:103] Registering Raw factory
Mar 13 10:36:01.453071 master-0 kubenswrapper[7508]: I0313 10:36:01.452935 7508 manager.go:1196] Started watching for new ooms in manager
Mar 13 10:36:01.453071 master-0 kubenswrapper[7508]: I0313 10:36:01.452929 7508 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 10:36:01.453863 master-0 kubenswrapper[7508]: I0313 10:36:01.453807 7508 manager.go:319] Starting recovery of all containers
Mar 13 10:36:01.458159 master-0 kubenswrapper[7508]: I0313 10:36:01.458109 7508 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 10:36:01.464425 master-0 kubenswrapper[7508]: I0313 10:36:01.464075 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" volumeName="kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config" seLinuxMountContext=""
Mar 13 10:36:01.464425 master-0 kubenswrapper[7508]: I0313 10:36:01.464408 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d2fdba3-9478-4165-9207-d01483625607" volumeName="kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464444 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b04498f0-5a3f-4461-aecb-50304662d854" volumeName="kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464467 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc" volumeName="kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464480 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464492 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464512 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464526 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53da2840-4a92-497a-a9d3-973583887147" volumeName="kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464541 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e69683c-59c5-43da-b105-ef2efb2d0a4e" volumeName="kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464550 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="893dac15-d6d4-4a1f-988c-59aaf9e63334" volumeName="kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464596 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a13f3e08-2b67-404f-8695-77aa17f92137" volumeName="kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464615 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464627 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides" seLinuxMountContext=""
Mar 13 10:36:01.464779 master-0 kubenswrapper[7508]: I0313 10:36:01.464668 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f358d81-87c6-40bf-89e8-5681429285f8" volumeName="kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464798 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" volumeName="kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464850 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba3e43ba-2840-4612-a370-87ad3c5a382a" volumeName="kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464875 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc" volumeName="kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464891 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464903 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f358d81-87c6-40bf-89e8-5681429285f8" volumeName="kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464915 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58685de6-b4ae-4229-870b-5143a6010450" volumeName="kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464927 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464940 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b04498f0-5a3f-4461-aecb-50304662d854" volumeName="kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464953 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ef32245-c238-43c6-a57a-a5ac95aff1f7" volumeName="kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464965 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" volumeName="kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464979 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8df2728b-4f21-4aef-b31f-4197bbcd2728" volumeName="kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.464991 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465008 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465020 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465032 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="024d9bd3-ac77-4257-9808-7518f2a73e11" volumeName="kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465044 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f872e59-1de1-4a95-8064-79696c73e8ab" volumeName="kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465137 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7d31378-e940-4473-ab37-10f250c76666" volumeName="kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465157 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465169 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465191 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f872e59-1de1-4a95-8064-79696c73e8ab" volumeName="kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465204 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" volumeName="kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465216 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b956d3-c046-4f26-8be2-718c165a3acc" volumeName="kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465226 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" volumeName="kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465237 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e69683c-59c5-43da-b105-ef2efb2d0a4e" volumeName="kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465248 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="893dac15-d6d4-4a1f-988c-59aaf9e63334" volumeName="kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465258 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465269 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465311 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03b97fde-467c-46f0-95f9-9c3820b4d790" volumeName="kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465333 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b956d3-c046-4f26-8be2-718c165a3acc" volumeName="kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465398 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465415 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f358d81-87c6-40bf-89e8-5681429285f8" volumeName="kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465429 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" volumeName="kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465448 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53da2840-4a92-497a-a9d3-973583887147" volumeName="kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465459 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465469 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465479 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" volumeName="kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465491 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides" seLinuxMountContext=""
Mar 13 10:36:01.465447 master-0 kubenswrapper[7508]: I0313 10:36:01.465503 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465521 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" volumeName="kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465534 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba3e43ba-2840-4612-a370-87ad3c5a382a" volumeName="kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465546 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465558 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53da2840-4a92-497a-a9d3-973583887147" volumeName="kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465573 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58685de6-b4ae-4229-870b-5143a6010450" volumeName="kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465585 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="893dac15-d6d4-4a1f-988c-59aaf9e63334" volumeName="kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465599 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465609 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9fd7b06-d61d-47c3-a08f-846245c79cc9" volumeName="kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465668 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9fd7b06-d61d-47c3-a08f-846245c79cc9" volumeName="kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465682 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" volumeName="kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465695 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ef32245-c238-43c6-a57a-a5ac95aff1f7" volumeName="kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465707 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" volumeName="kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465720 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba3e43ba-2840-4612-a370-87ad3c5a382a" volumeName="kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465731 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465748 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" volumeName="kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465760 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465771 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465783 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc" volumeName="kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465799 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465811 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465824 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f872e59-1de1-4a95-8064-79696c73e8ab" volumeName="kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465836 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d2fdba3-9478-4165-9207-d01483625607" volumeName="kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465848 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465859 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465871 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e87ca16c-25de-4fea-b900-2960f4a5f95e" volumeName="kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465882 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465928 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465944 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5da919b6-8545-4001-89f3-74cb289327f0" volumeName="kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465957 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e69683c-59c5-43da-b105-ef2efb2d0a4e" volumeName="kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465968 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" volumeName="kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465982 7508 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token" seLinuxMountContext=""
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.465996 7508 reconstruct.go:97] "Volume reconstruction finished"
Mar 13 10:36:01.467286 master-0 kubenswrapper[7508]: I0313 10:36:01.466005 7508 reconciler.go:26] "Reconciler: start to sync state"
Mar 13 10:36:01.470161 master-0 kubenswrapper[7508]: I0313 10:36:01.470025 7508 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 13 10:36:01.496261 master-0 kubenswrapper[7508]: I0313 10:36:01.496172 7508 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 13 10:36:01.498367 master-0 kubenswrapper[7508]: I0313 10:36:01.498339 7508 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 13 10:36:01.498463 master-0 kubenswrapper[7508]: I0313 10:36:01.498435 7508 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 13 10:36:01.498463 master-0 kubenswrapper[7508]: I0313 10:36:01.498461 7508 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 13 10:36:01.498615 master-0 kubenswrapper[7508]: E0313 10:36:01.498503 7508 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 13 10:36:01.501421 master-0 kubenswrapper[7508]: I0313 10:36:01.501346 7508 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 10:36:01.509990 master-0 kubenswrapper[7508]: I0313 10:36:01.509932 7508 generic.go:334] "Generic (PLEG): container finished" podID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerID="39e3998474ffa5421ada785b69659b745abc434915dc0302700b2f60923ba978" exitCode=0
Mar 13 10:36:01.523530 master-0 kubenswrapper[7508]: I0313 10:36:01.523452 7508 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="ce2b6ceda0b8c8212b1e35589d611accb6e40391c87b39cfb64f98a22b7e5dda" exitCode=0
Mar 13 10:36:01.523530 master-0 kubenswrapper[7508]: I0313 10:36:01.523506 7508 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="c08b2c581358381ac2f0c793ddf6295e272c0061c1b2d6e05d6e5ab7c2a5729b" exitCode=0
Mar 13 10:36:01.523530 master-0 kubenswrapper[7508]: I0313 10:36:01.523517 7508 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="ffc23a177a087ad146cddc2bc253947b08886f41c707f8ee47efc6dd4d3c5c8e" exitCode=0
Mar 13 10:36:01.523530 master-0 kubenswrapper[7508]: I0313 10:36:01.523528 7508 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="dc84ce423f666bcd523a540ff225040b69d4425d2faf8d523c79672591bd3375" exitCode=0
Mar 13 10:36:01.523530 master-0 kubenswrapper[7508]: I0313 10:36:01.523537 7508 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="c0bf4ee121253f4acc846c62a0fe4a189d6104b07034617c1152a5f95507935c" exitCode=0
Mar 13 10:36:01.523530 master-0 kubenswrapper[7508]: I0313 10:36:01.523546 7508 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="8b657fe74504b246eb725ae59f9af4bc83c980e78da29e84184ef677c02cddbf" exitCode=0
Mar 13 10:36:01.525500 master-0 kubenswrapper[7508]: I0313 10:36:01.525432 7508 generic.go:334] "Generic (PLEG): container finished" podID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerID="34b7e36b0204fb75f5eaa9ffadb1e13d0888ef1773ea6fc2201df90d0a2dcd5e" exitCode=0
Mar 13 10:36:01.544808 master-0 kubenswrapper[7508]: I0313 10:36:01.544757 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 10:36:01.545229 master-0 kubenswrapper[7508]: I0313 10:36:01.545188 7508 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3" exitCode=1
Mar 13 10:36:01.545338 master-0 kubenswrapper[7508]: I0313 10:36:01.545232 7508 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e61aa885acf4e08508a6ce338221d6e4395ca3102a9b91ced2db728621c8a1d6" exitCode=0
Mar 13 10:36:01.558028 master-0 kubenswrapper[7508]: I0313 10:36:01.557955 7508 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="ee4e3ab4663e1587ce994fc6b4abf7c85bf2b949922e7c558f6898fa4c2d1ce1" exitCode=0
Mar 13 10:36:01.567262 master-0 kubenswrapper[7508]: I0313 10:36:01.567202 7508 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="6babec6a5c3649a6bea3ec1be171dc4161391ea03cf72605db3e897bf23d8b34" exitCode=1
Mar 13 10:36:01.574373 master-0 kubenswrapper[7508]: I0313 10:36:01.574319 7508 generic.go:334] "Generic (PLEG): container finished" podID="fb060653-0d4b-4759-a7a1-c5dce194cce7" containerID="f741ec84eccfaea3008e82066654cae2f174abb120ece50ffb0345c3a6b62422" exitCode=0
Mar 13 10:36:01.590409 master-0 kubenswrapper[7508]: I0313 10:36:01.590035 7508 manager.go:324] Recovery completed
Mar 13 10:36:01.598851 master-0 kubenswrapper[7508]: E0313 10:36:01.598796 7508 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:36:01.631020 master-0 kubenswrapper[7508]: I0313 10:36:01.630958 7508 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 13 10:36:01.631020 master-0 kubenswrapper[7508]: I0313 10:36:01.630992 7508 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 13 10:36:01.631312 master-0 kubenswrapper[7508]: I0313 10:36:01.631046 7508 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 10:36:01.631371 master-0 kubenswrapper[7508]: I0313 10:36:01.631323 7508 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 13 10:36:01.631371 master-0 kubenswrapper[7508]: I0313 10:36:01.631339 7508 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 13 10:36:01.631444 master-0 kubenswrapper[7508]: I0313 10:36:01.631378 7508 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 13 10:36:01.631444 master-0 kubenswrapper[7508]: I0313 10:36:01.631388 7508 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 13 10:36:01.631444 master-0 kubenswrapper[7508]: I0313 10:36:01.631416 7508 policy_none.go:49] "None policy: Start"
Mar 13 10:36:01.634303 master-0 kubenswrapper[7508]: I0313 10:36:01.634272
7508 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 13 10:36:01.634370 master-0 kubenswrapper[7508]: I0313 10:36:01.634323 7508 state_mem.go:35] "Initializing new in-memory state store" Mar 13 10:36:01.634559 master-0 kubenswrapper[7508]: I0313 10:36:01.634541 7508 state_mem.go:75] "Updated machine memory state" Mar 13 10:36:01.634559 master-0 kubenswrapper[7508]: I0313 10:36:01.634559 7508 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 13 10:36:01.646212 master-0 kubenswrapper[7508]: I0313 10:36:01.646133 7508 manager.go:334] "Starting Device Plugin manager" Mar 13 10:36:01.646212 master-0 kubenswrapper[7508]: I0313 10:36:01.646186 7508 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 13 10:36:01.646212 master-0 kubenswrapper[7508]: I0313 10:36:01.646204 7508 server.go:79] "Starting device plugin registration server" Mar 13 10:36:01.646635 master-0 kubenswrapper[7508]: I0313 10:36:01.646616 7508 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 13 10:36:01.646698 master-0 kubenswrapper[7508]: I0313 10:36:01.646638 7508 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 13 10:36:01.646874 master-0 kubenswrapper[7508]: I0313 10:36:01.646828 7508 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 13 10:36:01.646987 master-0 kubenswrapper[7508]: I0313 10:36:01.646970 7508 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 13 10:36:01.646987 master-0 kubenswrapper[7508]: I0313 10:36:01.646981 7508 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 13 10:36:01.747521 master-0 kubenswrapper[7508]: I0313 10:36:01.747433 7508 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 13 10:36:01.749525 master-0 kubenswrapper[7508]: I0313 
10:36:01.749482 7508 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 13 10:36:01.749525 master-0 kubenswrapper[7508]: I0313 10:36:01.749529 7508 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 13 10:36:01.749525 master-0 kubenswrapper[7508]: I0313 10:36:01.749542 7508 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 13 10:36:01.749832 master-0 kubenswrapper[7508]: I0313 10:36:01.749589 7508 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 13 10:36:01.763379 master-0 kubenswrapper[7508]: I0313 10:36:01.762972 7508 kubelet_node_status.go:115] "Node was previously registered" node="master-0" Mar 13 10:36:01.763379 master-0 kubenswrapper[7508]: I0313 10:36:01.763077 7508 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 13 10:36:01.799288 master-0 kubenswrapper[7508]: I0313 10:36:01.799191 7508 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800202 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308" Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800233 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e90665783ccbc42369d3a5509b74862f544344287640369b3e630abf2508a1ac" Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800274 7508 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="a529a1b4625a9f72952239af4d0dacaabbd3d9fc81025c2ae8ab5074c27ffade" Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800302 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"c31330ee13d04180dffe9b5d1e1dc3fa90364bd389b7bdc31c0456dc4709e569"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800360 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800377 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"8d5872d3df5ae3d0356feb1227762765a592eb87fd4344b9e636b3a3e963fad0"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800388 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800400 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e61aa885acf4e08508a6ce338221d6e4395ca3102a9b91ced2db728621c8a1d6"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800412 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800425 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"2b53706ef774eb15c126f57be58e4c0c9f005142fd0e9af295b43871ae8de7ef"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800435 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"fc59a335ab92b5426116aa2f5adb31266760392f014df421d723f95bb6f6ebfb"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800445 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"e8e0120bb83ed513c2b33a7406952ffd8039dbde2867bd25a1c4d594e4e7407c"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800457 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"9e202824b084c4177db3bd9002d881090f9c8da16dc67819aecdad944afe647d"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800470 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"038536df2c456779ce7e0291a2536f4028dbe7eacec6c366598f83e56cd809ba"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800480 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" 
event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"ee4e3ab4663e1587ce994fc6b4abf7c85bf2b949922e7c558f6898fa4c2d1ce1"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800491 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800504 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"476341e9a176df7914ed42068e9cb3e621e16d05240f26c7f1a1bd7339384984"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800517 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"d4c4f345608352771d181c87ae83f87748ecbf6ccdee52cebdd330e421648437"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800531 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"6babec6a5c3649a6bea3ec1be171dc4161391ea03cf72605db3e897bf23d8b34"} Mar 13 10:36:01.805211 master-0 kubenswrapper[7508]: I0313 10:36:01.800541 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84"} Mar 13 10:36:01.871778 master-0 kubenswrapper[7508]: I0313 10:36:01.871731 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:36:01.871778 master-0 kubenswrapper[7508]: I0313 10:36:01.871776 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871793 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871808 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871824 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871838 7508 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871851 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871864 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871878 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871892 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " 
pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871904 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.871939 master-0 kubenswrapper[7508]: I0313 10:36:01.871919 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.872751 master-0 kubenswrapper[7508]: I0313 10:36:01.871932 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.872751 master-0 kubenswrapper[7508]: I0313 10:36:01.871966 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.872751 master-0 kubenswrapper[7508]: I0313 10:36:01.871981 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.872751 master-0 kubenswrapper[7508]: I0313 10:36:01.871996 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.872751 master-0 kubenswrapper[7508]: I0313 10:36:01.872012 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.874818 master-0 kubenswrapper[7508]: E0313 10:36:01.874785 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:36:01.875162 master-0 kubenswrapper[7508]: W0313 10:36:01.875131 7508 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted 
volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost") Mar 13 10:36:01.875224 master-0 kubenswrapper[7508]: E0313 10:36:01.875180 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:36:01.875355 master-0 kubenswrapper[7508]: E0313 10:36:01.875330 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.875477 master-0 kubenswrapper[7508]: E0313 10:36:01.875429 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972502 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972603 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972644 7508 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972670 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972688 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972709 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972727 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972751 
7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972769 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972787 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972804 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972822 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972839 7508 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972857 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972875 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972894 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.972946 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 
10:36:01.972972 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973015 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973065 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973118 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973144 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973166 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973192 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973216 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973237 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973258 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973278 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973304 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973330 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973387 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973416 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973438 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: I0313 10:36:01.973462 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:01.975256 master-0 kubenswrapper[7508]: E0313 10:36:01.974122 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:36:02.432642 master-0 kubenswrapper[7508]: I0313 10:36:02.432579 7508 apiserver.go:52] "Watching apiserver"
Mar 13 10:36:02.448486 master-0 kubenswrapper[7508]: I0313 10:36:02.448433 7508 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 10:36:02.449376 master-0 kubenswrapper[7508]: I0313 10:36:02.449302 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf","assisted-installer/assisted-installer-controller-k96f8","openshift-config-operator/openshift-config-operator-64488f9d78-pchtd","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j","openshift-multus/multus-additional-cni-plugins-72t2n","openshift-network-operator/iptables-alerter-55t7x","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z","openshift-dns-operator/dns-operator-589895fbb7-6zkqh","openshift-etcd/etcd-master-0-master-0","openshift-ingress-operator/ingress-operator-677db989d6-b2ss8","openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz","openshift-multus/multus-bjv5r","openshift-ovn-kubernetes/ovnkube-node-vww4t","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8","openshift-network-diagnostics/network-check-target-jwfjl","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26","openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq","openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn","kube-system/bootstrap-kube-scheduler-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-marketplace/marketplace-operator-64bf9778cb-4v99n","openshift-network-operator/network-operator-7c649bf6d4-z9wrg","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-multus/multus-admission-controller-8d675b596-6gzxr","openshift-multus/network-metrics-daemon-c5vhc","openshift-network-node-identity/network-node-identity-hkjrg"]
Mar 13 10:36:02.450975 master-0 kubenswrapper[7508]: I0313 10:36:02.450651 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.450975 master-0 kubenswrapper[7508]: I0313 10:36:02.450732 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:02.450975 master-0 kubenswrapper[7508]: I0313 10:36:02.450887 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:02.450975 master-0 kubenswrapper[7508]: I0313 10:36:02.450922 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh"
Mar 13 10:36:02.450975 master-0 kubenswrapper[7508]: I0313 10:36:02.450942 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:02.453508 master-0 kubenswrapper[7508]: I0313 10:36:02.451438 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"
Mar 13 10:36:02.453508 master-0 kubenswrapper[7508]: I0313 10:36:02.451564 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-k96f8"
Mar 13 10:36:02.453508 master-0 kubenswrapper[7508]: I0313 10:36:02.451594 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:36:02.453508 master-0 kubenswrapper[7508]: I0313 10:36:02.451622 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:02.453508 master-0 kubenswrapper[7508]: I0313 10:36:02.451632 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:02.453508 master-0 kubenswrapper[7508]: I0313 10:36:02.451646 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:36:02.453799 master-0 kubenswrapper[7508]: I0313 10:36:02.453632 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 13 10:36:02.454579 master-0 kubenswrapper[7508]: I0313 10:36:02.453892 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.454579 master-0 kubenswrapper[7508]: I0313 10:36:02.453962 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 10:36:02.454579 master-0 kubenswrapper[7508]: I0313 10:36:02.454196 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 10:36:02.456982 master-0 kubenswrapper[7508]: I0313 10:36:02.456942 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 10:36:02.457088 master-0 kubenswrapper[7508]: I0313 10:36:02.457072 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 10:36:02.464906 master-0 kubenswrapper[7508]: I0313 10:36:02.457257 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:36:02.467328 master-0 kubenswrapper[7508]: I0313 10:36:02.467058 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:02.469656 master-0 kubenswrapper[7508]: I0313 10:36:02.467923 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:02.469756 master-0 kubenswrapper[7508]: I0313 10:36:02.469726 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 10:36:02.469968 master-0 kubenswrapper[7508]: I0313 10:36:02.469946 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.470154 master-0 kubenswrapper[7508]: I0313 10:36:02.470123 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 13 10:36:02.470285 master-0 kubenswrapper[7508]: I0313 10:36:02.470265 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 10:36:02.470447 master-0 kubenswrapper[7508]: I0313 10:36:02.470357 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 10:36:02.470447 master-0 kubenswrapper[7508]: I0313 10:36:02.470446 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 10:36:02.471047 master-0 kubenswrapper[7508]: I0313 10:36:02.470801 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 13 10:36:02.471546 master-0 kubenswrapper[7508]: I0313 10:36:02.471502 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.473630 master-0 kubenswrapper[7508]: I0313 10:36:02.473578 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.473754 master-0 kubenswrapper[7508]: I0313 10:36:02.473635 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.473887 master-0 kubenswrapper[7508]: I0313 10:36:02.473851 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 10:36:02.473941 master-0 kubenswrapper[7508]: I0313 10:36:02.473878 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 10:36:02.473941 master-0 kubenswrapper[7508]: I0313 10:36:02.471513 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.474130 master-0 kubenswrapper[7508]: I0313 10:36:02.474112 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 10:36:02.474276 master-0 kubenswrapper[7508]: I0313 10:36:02.474253 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 10:36:02.474425 master-0 kubenswrapper[7508]: I0313 10:36:02.474382 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 10:36:02.474488 master-0 kubenswrapper[7508]: I0313 10:36:02.474457 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 13 10:36:02.474536 master-0 kubenswrapper[7508]: I0313 10:36:02.474485 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.474536 master-0 kubenswrapper[7508]: I0313 10:36:02.474508 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 10:36:02.474536 master-0 kubenswrapper[7508]: I0313 10:36:02.471680 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 10:36:02.474536 master-0 kubenswrapper[7508]: I0313 10:36:02.474406 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.474693 master-0 kubenswrapper[7508]: I0313 10:36:02.474554 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 10:36:02.474693 master-0 kubenswrapper[7508]: I0313 10:36:02.472024 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472044 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472084 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472173 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472226 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472631 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472701 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472744 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472924 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.472958 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.473004 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.473032 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 10:36:02.474792 master-0 kubenswrapper[7508]: I0313 10:36:02.471780 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.475190 master-0 kubenswrapper[7508]: I0313 10:36:02.474877 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 10:36:02.475190 master-0 kubenswrapper[7508]: I0313 10:36:02.475055 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.475190 master-0 kubenswrapper[7508]: I0313 10:36:02.475155 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 10:36:02.477004 master-0 kubenswrapper[7508]: I0313 10:36:02.476968 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 10:36:02.477067 master-0 kubenswrapper[7508]: I0313 10:36:02.477042 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 10:36:02.477067 master-0 kubenswrapper[7508]: I0313 10:36:02.477049 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 10:36:02.477178 master-0 kubenswrapper[7508]: I0313 10:36:02.477147 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 10:36:02.477225 master-0 kubenswrapper[7508]: I0313 10:36:02.477196 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 10:36:02.477327 master-0 kubenswrapper[7508]: I0313 10:36:02.477303 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.477448 master-0 kubenswrapper[7508]: I0313 10:36:02.477406 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.477616 master-0 kubenswrapper[7508]: I0313 10:36:02.477590 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 10:36:02.477647 master-0 kubenswrapper[7508]: I0313 10:36:02.477627 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 10:36:02.477672 master-0 kubenswrapper[7508]: I0313 10:36:02.477658 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.477713 master-0 kubenswrapper[7508]: I0313 10:36:02.477695 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 10:36:02.477746 master-0 kubenswrapper[7508]: I0313 10:36:02.477711 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.477780 master-0 kubenswrapper[7508]: I0313 10:36:02.477744 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.477814 master-0 kubenswrapper[7508]: I0313 10:36:02.477799 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 10:36:02.477840 master-0 kubenswrapper[7508]: I0313 10:36:02.477818 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 13 10:36:02.477919 master-0 kubenswrapper[7508]: I0313 10:36:02.477901 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 10:36:02.477957 master-0 kubenswrapper[7508]: I0313 10:36:02.477915 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 13 10:36:02.477987 master-0 kubenswrapper[7508]: I0313 10:36:02.477599 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 13 10:36:02.478013 master-0 kubenswrapper[7508]: I0313 10:36:02.477982 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 10:36:02.478071 master-0 kubenswrapper[7508]: I0313 10:36:02.478056 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 10:36:02.478123 master-0 kubenswrapper[7508]: I0313 10:36:02.478076 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 10:36:02.478162 master-0 kubenswrapper[7508]: I0313 10:36:02.478135 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 10:36:02.478224 master-0 kubenswrapper[7508]: I0313 10:36:02.478207 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 10:36:02.478256 master-0 kubenswrapper[7508]: I0313 10:36:02.478223 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 13 10:36:02.478294 master-0 kubenswrapper[7508]: I0313 10:36:02.478259 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 13 10:36:02.478340 master-0 kubenswrapper[7508]: I0313 10:36:02.478326 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 10:36:02.478375 master-0 kubenswrapper[7508]: I0313 10:36:02.478356 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 10:36:02.478762 master-0 kubenswrapper[7508]: I0313 10:36:02.478736 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 10:36:02.479044 master-0 kubenswrapper[7508]: I0313 10:36:02.479025 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 10:36:02.479209 master-0 kubenswrapper[7508]: I0313 10:36:02.479109 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 10:36:02.479209 master-0 kubenswrapper[7508]: I0313 10:36:02.479162 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.479376 master-0 kubenswrapper[7508]: I0313 10:36:02.479344 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 10:36:02.479769 master-0 kubenswrapper[7508]: I0313 10:36:02.479718 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.479769 master-0 kubenswrapper[7508]: I0313 10:36:02.479754 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.480059 master-0 kubenswrapper[7508]: I0313 10:36:02.480029 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 13 10:36:02.480211 master-0 kubenswrapper[7508]: I0313 10:36:02.480189 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 10:36:02.480324 master-0 kubenswrapper[7508]: I0313 10:36:02.480282 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 10:36:02.480534 master-0 kubenswrapper[7508]: I0313 10:36:02.480505 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 10:36:02.480631 master-0 kubenswrapper[7508]: I0313 10:36:02.480611 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 10:36:02.480809 master-0 kubenswrapper[7508]: I0313 10:36:02.480781 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.480982 master-0 kubenswrapper[7508]: I0313 10:36:02.480961 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 13 10:36:02.481226 master-0 kubenswrapper[7508]: I0313 10:36:02.481207 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.486559 master-0 kubenswrapper[7508]: I0313 10:36:02.484740 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 10:36:02.490807 master-0 kubenswrapper[7508]: I0313 10:36:02.490765 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 10:36:02.491285 master-0 kubenswrapper[7508]: I0313 10:36:02.491251 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 10:36:02.491538 master-0 kubenswrapper[7508]: I0313 10:36:02.491489 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 10:36:02.492227 master-0 kubenswrapper[7508]: I0313 10:36:02.492200 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.494525 master-0 kubenswrapper[7508]: I0313 10:36:02.494502 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 10:36:02.496486 master-0 kubenswrapper[7508]: I0313 10:36:02.494827 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 10:36:02.496486 master-0 kubenswrapper[7508]: I0313 10:36:02.495297 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 10:36:02.496486 master-0 kubenswrapper[7508]: I0313 10:36:02.496320 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 10:36:02.498674 master-0 kubenswrapper[7508]: I0313 10:36:02.498622 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 10:36:02.499331 master-0 kubenswrapper[7508]: I0313 10:36:02.499299 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 13 10:36:02.500932 master-0 kubenswrapper[7508]: I0313 10:36:02.500902 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 13 10:36:02.503071 master-0 kubenswrapper[7508]: I0313 10:36:02.503039 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 10:36:02.514491 master-0 kubenswrapper[7508]: I0313 10:36:02.514447 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 10:36:02.534488 master-0 kubenswrapper[7508]: I0313 10:36:02.534438 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 10:36:02.549408 master-0 kubenswrapper[7508]: I0313 10:36:02.549363 7508 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 13 10:36:02.556058 master-0 kubenswrapper[7508]: I0313 10:36:02.556009 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 10:36:02.575537 master-0 kubenswrapper[7508]: I0313 10:36:02.575485 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 10:36:02.576641 master-0 kubenswrapper[7508]: I0313 10:36:02.576599 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:36:02.576712 master-0 kubenswrapper[7508]: I0313 10:36:02.576640 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:36:02.576712 master-0 kubenswrapper[7508]: I0313 10:36:02.576668 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:36:02.576712 master-0 kubenswrapper[7508]: I0313 10:36:02.576694 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.576873 master-0 kubenswrapper[7508]: I0313 10:36:02.576717 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.577343 master-0 kubenswrapper[7508]: I0313 10:36:02.577314 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.577389 master-0 kubenswrapper[7508]: I0313 10:36:02.577349 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.577389 master-0 kubenswrapper[7508]: I0313 10:36:02.577313 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:36:02.577474 master-0 kubenswrapper[7508]: I0313 10:36:02.577401 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.577474 master-0 kubenswrapper[7508]: I0313 10:36:02.577433 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\")
pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:02.577537 master-0 kubenswrapper[7508]: I0313 10:36:02.577489 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.577537 master-0 kubenswrapper[7508]: I0313 10:36:02.577509 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.577537 master-0 kubenswrapper[7508]: I0313 10:36:02.577528 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:36:02.577612 master-0 kubenswrapper[7508]: I0313 10:36:02.577530 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:36:02.577612 master-0 kubenswrapper[7508]: I0313 10:36:02.577545 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:02.577612 master-0 kubenswrapper[7508]: I0313 10:36:02.577609 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:36:02.577696 master-0 kubenswrapper[7508]: I0313 10:36:02.577631 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:36:02.577696 master-0 kubenswrapper[7508]: I0313 10:36:02.577675 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.577696 master-0 kubenswrapper[7508]: I0313 10:36:02.577691 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 
10:36:02.577769 master-0 kubenswrapper[7508]: I0313 10:36:02.577753 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.577851 master-0 kubenswrapper[7508]: I0313 10:36:02.577829 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:02.577882 master-0 kubenswrapper[7508]: I0313 10:36:02.577858 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:36:02.577977 master-0 kubenswrapper[7508]: I0313 10:36:02.577937 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:02.578009 master-0 kubenswrapper[7508]: I0313 10:36:02.577944 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.578086 master-0 kubenswrapper[7508]: I0313 10:36:02.578049 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.578152 master-0 kubenswrapper[7508]: I0313 10:36:02.578121 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:36:02.578152 master-0 kubenswrapper[7508]: I0313 10:36:02.578124 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:36:02.578224 master-0 kubenswrapper[7508]: I0313 10:36:02.578189 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.578224 
master-0 kubenswrapper[7508]: I0313 10:36:02.578214 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.578310 master-0 kubenswrapper[7508]: I0313 10:36:02.578260 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:36:02.578353 master-0 kubenswrapper[7508]: I0313 10:36:02.578310 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.578353 master-0 kubenswrapper[7508]: I0313 10:36:02.578270 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:36:02.578524 master-0 kubenswrapper[7508]: I0313 10:36:02.578488 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod 
\"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.578592 master-0 kubenswrapper[7508]: I0313 10:36:02.578539 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:02.578592 master-0 kubenswrapper[7508]: I0313 10:36:02.578566 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:36:02.578592 master-0 kubenswrapper[7508]: I0313 10:36:02.578584 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578599 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578733 7508 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578739 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578789 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578809 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578845 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" 
(UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578863 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578880 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578888 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578913 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578930 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.578977 master-0 kubenswrapper[7508]: I0313 10:36:02.578966 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579002 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579019 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579036 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.579363 
master-0 kubenswrapper[7508]: I0313 10:36:02.579043 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579065 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579082 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn5nv\" (UniqueName: \"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579115 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579135 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: 
\"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579152 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579169 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579202 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579221 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579238 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579239 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579279 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579303 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579322 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: 
\"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:36:02.579363 master-0 kubenswrapper[7508]: I0313 10:36:02.579359 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.579382 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.579399 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.579430 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.579534 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.579928 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.579991 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.580007 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.580046 master-0 kubenswrapper[7508]: I0313 10:36:02.580037 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " 
pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580057 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580058 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580074 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580119 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580166 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580210 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580234 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580249 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580281 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580315 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580347 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580384 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:36:02.580420 master-0 kubenswrapper[7508]: I0313 10:36:02.580429 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580461 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580492 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580524 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580561 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580577 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580597 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580635 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580690 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580725 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580756 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.580879 master-0 kubenswrapper[7508]: I0313 10:36:02.580834 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.580894 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.580913 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.580949 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfd8\" (UniqueName: \"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.580966 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.580982 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.581024 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.581043 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.581060 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.581133 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.581277 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:36:02.581326 master-0 kubenswrapper[7508]: I0313 10:36:02.581303 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581349 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581272 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581400 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581474 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581548 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581569 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581574 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581572 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581587 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581636 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581666 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581718 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581839 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581858 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:36:02.581861 master-0 kubenswrapper[7508]: I0313 10:36:02.581860 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.581884 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.581887 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.581902 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.581973 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582011 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582055 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582133 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582165 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582235 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582247 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582267 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582302 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582375 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582403 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"
Mar 13 10:36:02.582527 master-0 kubenswrapper[7508]: I0313 10:36:02.582452 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582567 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582587 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582667 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582733 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582785 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582836 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582881 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582901 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582921 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582938 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582955 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.582970 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583011 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583051 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583076 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583081 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583093 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583132 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583136 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583140 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583157 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:02.583188 master-0 kubenswrapper[7508]: I0313 10:36:02.583186 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583223 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583267 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583275 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583304 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583330 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583359 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583397 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583435 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583478 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod
\"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583485 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583539 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583614 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583662 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583731 7508 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583767 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583768 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583785 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583852 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " 
pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583881 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:02.583965 master-0 kubenswrapper[7508]: I0313 10:36:02.583938 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.584683 master-0 kubenswrapper[7508]: I0313 10:36:02.584003 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.584683 master-0 kubenswrapper[7508]: I0313 10:36:02.584185 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.584683 master-0 kubenswrapper[7508]: I0313 10:36:02.584234 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod 
\"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:36:02.595874 master-0 kubenswrapper[7508]: I0313 10:36:02.595826 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 10:36:02.646618 master-0 kubenswrapper[7508]: I0313 10:36:02.646538 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:02.677364 master-0 kubenswrapper[7508]: I0313 10:36:02.677302 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:36:02.683947 master-0 kubenswrapper[7508]: E0313 10:36:02.683870 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:02.684516 master-0 kubenswrapper[7508]: I0313 10:36:02.684488 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.684582 master-0 
kubenswrapper[7508]: I0313 10:36:02.684522 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:36:02.684582 master-0 kubenswrapper[7508]: I0313 10:36:02.684554 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.684635 master-0 kubenswrapper[7508]: I0313 10:36:02.684589 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:02.684675 master-0 kubenswrapper[7508]: I0313 10:36:02.684644 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:36:02.684712 master-0 kubenswrapper[7508]: I0313 10:36:02.684678 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: 
\"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.684750 master-0 kubenswrapper[7508]: I0313 10:36:02.684708 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.684750 master-0 kubenswrapper[7508]: E0313 10:36:02.684724 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:36:02.684846 master-0 kubenswrapper[7508]: E0313 10:36:02.684822 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.184799149 +0000 UTC m=+1.927624266 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:36:02.684846 master-0 kubenswrapper[7508]: E0313 10:36:02.684823 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:36:02.684952 master-0 kubenswrapper[7508]: E0313 10:36:02.684852 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:36:03.18484553 +0000 UTC m=+1.927670647 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:36:02.684952 master-0 kubenswrapper[7508]: E0313 10:36:02.684891 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:36:02.684952 master-0 kubenswrapper[7508]: I0313 10:36:02.684900 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.684952 master-0 kubenswrapper[7508]: E0313 10:36:02.684911 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.184904682 +0000 UTC m=+1.927729799 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:36:02.684952 master-0 kubenswrapper[7508]: I0313 10:36:02.684740 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.684963 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.684984 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685004 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 
10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685020 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685035 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685054 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685078 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685121 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: 
\"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:02.685155 master-0 kubenswrapper[7508]: I0313 10:36:02.685143 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685175 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685225 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685257 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685281 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685297 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685316 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685330 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685401 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685430 master-0 kubenswrapper[7508]: I0313 10:36:02.685427 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685486 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: E0313 10:36:02.685500 7508 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685513 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: E0313 10:36:02.685548 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.185531428 +0000 UTC m=+1.928356645 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: E0313 10:36:02.685565 7508 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685570 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: E0313 10:36:02.685589 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.185582399 +0000 UTC m=+1.928407516 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685606 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685623 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685648 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685667 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685672 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: 
\"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685692 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.685695 master-0 kubenswrapper[7508]: I0313 10:36:02.685696 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: E0313 10:36:02.685734 7508 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: E0313 10:36:02.685764 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.185746864 +0000 UTC m=+1.928571981 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685731 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: E0313 10:36:02.685782 7508 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685793 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685812 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: E0313 10:36:02.685823 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.185812365 +0000 UTC m=+1.928637572 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: E0313 10:36:02.685854 7508 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685865 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: E0313 10:36:02.685874 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.185868077 +0000 UTC m=+1.928693194 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685904 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685925 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685930 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685970 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685980 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.686002 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.686025 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686051 master-0 kubenswrapper[7508]: I0313 10:36:02.685908 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686066 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686111 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686129 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686138 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686155 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686172 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686189 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.685612 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686212 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.186205246 +0000 UTC m=+1.929030483 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686233 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686239 7508 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686258 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686272 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.186262187 +0000 UTC m=+1.929087394 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686282 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686310 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686347 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686344 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686390 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686400 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686423 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686434 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: I0313 10:36:02.686439 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686470 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.186460992 +0000 UTC m=+1.929286209 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686495 7508 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 13 10:36:02.686519 master-0 kubenswrapper[7508]: E0313 10:36:02.686530 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.186519574 +0000 UTC m=+1.929344791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686497 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686570 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686598 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686622 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686642 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686651 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686672 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: E0313 10:36:02.686696 7508 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686697 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: E0313 10:36:02.686724 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:03.186714879 +0000 UTC m=+1.929540096 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686742 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686747 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686767 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686788 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686816 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686849 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686878 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686924 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686947 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686973 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.686977 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.687000 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.687059 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.687256 master-0 kubenswrapper[7508]: I0313 10:36:02.687147 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:02.703749 master-0 kubenswrapper[7508]: W0313 10:36:02.703665 7508 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 13 10:36:02.703749 master-0 kubenswrapper[7508]: E0313 10:36:02.703745 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 13 10:36:02.712780 master-0 kubenswrapper[7508]: I0313 10:36:02.712720 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:02.718291 master-0 kubenswrapper[7508]: I0313 10:36:02.718261 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:02.729045 master-0 kubenswrapper[7508]: I0313 10:36:02.728994 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:36:02.744361 master-0 kubenswrapper[7508]: E0313 10:36:02.744276 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 13 10:36:02.760540 master-0 kubenswrapper[7508]: I0313 10:36:02.759732 7508 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 10:36:02.760540 master-0 kubenswrapper[7508]: E0313 10:36:02.760364 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 13 10:36:02.782620 master-0 kubenswrapper[7508]: E0313 10:36:02.782514 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:36:02.856412 master-0 kubenswrapper[7508]: I0313 10:36:02.856350 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z"
Mar 13 10:36:02.857012 master-0 kubenswrapper[7508]: I0313 10:36:02.856959 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:36:02.859069 master-0 kubenswrapper[7508]: I0313 10:36:02.859020 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:02.870154 master-0 kubenswrapper[7508]: I0313 10:36:02.870074 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:02.888359 master-0 kubenswrapper[7508]: I0313 10:36:02.888294 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:36:02.906280 master-0 kubenswrapper[7508]: I0313 10:36:02.906188 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh"
Mar 13 10:36:02.928202 master-0 kubenswrapper[7508]: I0313 10:36:02.928122 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"
Mar 13 10:36:02.947776 master-0 kubenswrapper[7508]: I0313 10:36:02.947632 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:36:02.968225 master-0 kubenswrapper[7508]: I0313 10:36:02.968172 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:02.987768 master-0 kubenswrapper[7508]: I0313 10:36:02.987712 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:36:03.007552 master-0 kubenswrapper[7508]: I0313 10:36:03.007507 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:36:03.024832 master-0 kubenswrapper[7508]: I0313 10:36:03.024784 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:03.049943 master-0 kubenswrapper[7508]: I0313 10:36:03.049893 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn5nv\" (UniqueName: \"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x"
Mar 13 10:36:03.075911 master-0 kubenswrapper[7508]: I0313 10:36:03.075868 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:36:03.091922 master-0 kubenswrapper[7508]: I0313 10:36:03.091871 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:03.107381 master-0 kubenswrapper[7508]: I0313 10:36:03.107345 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:03.127672 master-0
kubenswrapper[7508]: I0313 10:36:03.127633 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:36:03.146514 master-0 kubenswrapper[7508]: I0313 10:36:03.146476 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:36:03.154345 master-0 kubenswrapper[7508]: I0313 10:36:03.154272 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:03.167510 master-0 kubenswrapper[7508]: I0313 10:36:03.167478 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:36:03.191464 master-0 kubenswrapper[7508]: I0313 10:36:03.191424 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfd8\" (UniqueName: \"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 
10:36:03.192176 master-0 kubenswrapper[7508]: I0313 10:36:03.192156 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:03.209768 master-0 kubenswrapper[7508]: I0313 10:36:03.209691 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:03.228343 master-0 kubenswrapper[7508]: I0313 10:36:03.228310 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:36:03.242006 master-0 kubenswrapper[7508]: I0313 10:36:03.241976 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:03.242189 master-0 kubenswrapper[7508]: I0313 10:36:03.242014 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:03.242189 master-0 kubenswrapper[7508]: I0313 10:36:03.242047 7508 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:36:03.242347 master-0 kubenswrapper[7508]: E0313 10:36:03.242265 7508 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:36:03.242347 master-0 kubenswrapper[7508]: I0313 10:36:03.242301 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:03.242458 master-0 kubenswrapper[7508]: I0313 10:36:03.242360 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:03.242458 master-0 kubenswrapper[7508]: E0313 10:36:03.242386 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242357161 +0000 UTC m=+2.985182368 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:36:03.242458 master-0 kubenswrapper[7508]: E0313 10:36:03.242402 7508 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 10:36:03.242458 master-0 kubenswrapper[7508]: I0313 10:36:03.242427 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:03.242458 master-0 kubenswrapper[7508]: E0313 10:36:03.242461 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242447293 +0000 UTC m=+2.985272410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242480 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242490 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242502 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242521 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242515085 +0000 UTC m=+2.985340202 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242538 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242526595 +0000 UTC m=+2.985351782 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242554 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242559 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242549426 +0000 UTC m=+2.985374653 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: E0313 10:36:03.242576 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242568376 +0000 UTC m=+2.985393603 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: I0313 10:36:03.242595 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: I0313 10:36:03.242624 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: I0313 10:36:03.242649 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:03.242673 master-0 kubenswrapper[7508]: I0313 10:36:03.242676 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: 
\"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242703 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: I0313 10:36:03.242708 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242733 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.2427232 +0000 UTC m=+2.985548318 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: I0313 10:36:03.242750 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242773 7508 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: I0313 10:36:03.242779 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242805 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242794142 +0000 UTC m=+2.985619339 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242849 7508 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242907 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242898385 +0000 UTC m=+2.985723502 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242851 7508 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242918 7508 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242932 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242927066 +0000 UTC m=+2.985752183 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242882 7508 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242946 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242937126 +0000 UTC m=+2.985762243 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242963 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.242957867 +0000 UTC m=+2.985782984 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.242988 7508 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:36:03.243227 master-0 kubenswrapper[7508]: E0313 10:36:03.243022 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:04.243012888 +0000 UTC m=+2.985838015 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:36:03.247220 master-0 kubenswrapper[7508]: I0313 10:36:03.247202 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:36:03.268447 master-0 kubenswrapper[7508]: I0313 10:36:03.268403 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: 
\"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:36:03.291974 master-0 kubenswrapper[7508]: I0313 10:36:03.291925 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:03.306765 master-0 kubenswrapper[7508]: I0313 10:36:03.306708 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:36:03.329545 master-0 kubenswrapper[7508]: I0313 10:36:03.329460 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:36:03.349976 master-0 kubenswrapper[7508]: I0313 10:36:03.349923 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:03.366468 master-0 kubenswrapper[7508]: I0313 10:36:03.366432 7508 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:03.385827 master-0 kubenswrapper[7508]: I0313 10:36:03.385780 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:36:03.432846 master-0 kubenswrapper[7508]: I0313 10:36:03.432796 7508 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 10:36:03.440294 master-0 kubenswrapper[7508]: I0313 10:36:03.439627 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:36:03.658082 master-0 kubenswrapper[7508]: I0313 10:36:03.658029 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:36:03.658412 master-0 kubenswrapper[7508]: E0313 10:36:03.658122 7508 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3" Mar 13 10:36:03.658496 master-0 kubenswrapper[7508]: E0313 10:36:03.658439 7508 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt --terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt 
--terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kd99t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-5884b9cd56-t2xfz_openshift-etcd-operator(0932314b-ccf5-4be5-99f8-b99886392daa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 10:36:03.660209 master-0 kubenswrapper[7508]: E0313 10:36:03.660163 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" podUID="0932314b-ccf5-4be5-99f8-b99886392daa" Mar 13 10:36:03.744026 master-0 kubenswrapper[7508]: I0313 10:36:03.743959 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:03.782055 master-0 kubenswrapper[7508]: I0313 10:36:03.781996 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:36:04.256223 master-0 kubenswrapper[7508]: I0313 10:36:04.256174 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:36:04.256223 master-0 kubenswrapper[7508]: I0313 10:36:04.256239 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:04.256644 master-0 kubenswrapper[7508]: E0313 10:36:04.256408 7508 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found 
Mar 13 10:36:04.256644 master-0 kubenswrapper[7508]: E0313 10:36:04.256515 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.256489741 +0000 UTC m=+4.999314858 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found Mar 13 10:36:04.256927 master-0 kubenswrapper[7508]: I0313 10:36:04.256889 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:04.256999 master-0 kubenswrapper[7508]: I0313 10:36:04.256941 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:04.256999 master-0 kubenswrapper[7508]: I0313 10:36:04.256975 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " 
pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:04.257089 master-0 kubenswrapper[7508]: I0313 10:36:04.256999 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:04.257089 master-0 kubenswrapper[7508]: I0313 10:36:04.257040 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:04.257089 master-0 kubenswrapper[7508]: I0313 10:36:04.257069 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:04.257232 master-0 kubenswrapper[7508]: I0313 10:36:04.257122 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:04.257232 master-0 kubenswrapper[7508]: I0313 10:36:04.257159 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:04.257232 master-0 kubenswrapper[7508]: I0313 10:36:04.257186 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:04.257232 master-0 kubenswrapper[7508]: I0313 10:36:04.257228 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:04.257365 master-0 kubenswrapper[7508]: I0313 10:36:04.257254 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:04.257365 master-0 kubenswrapper[7508]: E0313 10:36:04.257349 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:36:04.257442 master-0 kubenswrapper[7508]: E0313 10:36:04.257382 7508 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.257371644 +0000 UTC m=+5.000196761 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:36:04.257442 master-0 kubenswrapper[7508]: E0313 10:36:04.257433 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:36:04.257527 master-0 kubenswrapper[7508]: E0313 10:36:04.257459 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.257450716 +0000 UTC m=+5.000275833 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:36:04.257527 master-0 kubenswrapper[7508]: E0313 10:36:04.257505 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:36:04.257641 master-0 kubenswrapper[7508]: E0313 10:36:04.257533 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:36:06.257523518 +0000 UTC m=+5.000348645 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:36:04.257641 master-0 kubenswrapper[7508]: E0313 10:36:04.257579 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:36:04.257641 master-0 kubenswrapper[7508]: E0313 10:36:04.257604 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.25759574 +0000 UTC m=+5.000420857 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:36:04.257830 master-0 kubenswrapper[7508]: E0313 10:36:04.257644 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:04.257830 master-0 kubenswrapper[7508]: E0313 10:36:04.257670 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:36:06.257661512 +0000 UTC m=+5.000486719 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:04.257830 master-0 kubenswrapper[7508]: E0313 10:36:04.257715 7508 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:04.257830 master-0 kubenswrapper[7508]: E0313 10:36:04.257817 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.257804505 +0000 UTC m=+5.000629622 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:36:04.257973 master-0 kubenswrapper[7508]: E0313 10:36:04.257879 7508 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:04.257973 master-0 kubenswrapper[7508]: E0313 10:36:04.257907 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.257898738 +0000 UTC m=+5.000723855 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:36:04.257973 master-0 kubenswrapper[7508]: E0313 10:36:04.257958 7508 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:36:04.258071 master-0 kubenswrapper[7508]: E0313 10:36:04.257986 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.25797615 +0000 UTC m=+5.000801267 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:36:04.258071 master-0 kubenswrapper[7508]: E0313 10:36:04.258036 7508 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:04.258071 master-0 kubenswrapper[7508]: E0313 10:36:04.258061 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.258053242 +0000 UTC m=+5.000878359 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:04.258179 master-0 kubenswrapper[7508]: E0313 10:36:04.258123 7508 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:04.258179 master-0 kubenswrapper[7508]: E0313 10:36:04.258149 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.258141754 +0000 UTC m=+5.000966871 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:04.258260 master-0 kubenswrapper[7508]: E0313 10:36:04.258193 7508 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:36:04.258260 master-0 kubenswrapper[7508]: E0313 10:36:04.258217 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.258209156 +0000 UTC m=+5.001034283 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:36:04.258610 master-0 kubenswrapper[7508]: E0313 10:36:04.258560 7508 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:36:04.258710 master-0 kubenswrapper[7508]: E0313 10:36:04.258691 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:06.258653118 +0000 UTC m=+5.001478275 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:36:04.425628 master-0 kubenswrapper[7508]: E0313 10:36:04.425558 7508 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b" Mar 13 10:36:04.425917 master-0 kubenswrapper[7508]: E0313 10:36:04.425814 7508 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b,Command:[cluster-openshift-controller-manager-operator 
operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:ROUTE_CONTROLLER_MANAGER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rpnm8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource
{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-controller-manager-operator-8565d84698-4kpg8_openshift-controller-manager-operator(1f358d81-87c6-40bf-89e8-5681429285f8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 10:36:04.427144 master-0 kubenswrapper[7508]: E0313 10:36:04.427066 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-controller-manager-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" podUID="1f358d81-87c6-40bf-89e8-5681429285f8" Mar 13 10:36:04.584078 master-0 kubenswrapper[7508]: I0313 10:36:04.583961 7508 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 10:36:05.055239 master-0 kubenswrapper[7508]: E0313 10:36:05.055165 7508 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953" Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: E0313 10:36:05.055417 7508 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: container &Container{Name:authentication-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953,Command:[/bin/bash -ec],Args:[if [ -s /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt ]; then Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: echo "Copying system trust bundle" Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: cp -f 
/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: fi Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: exec authentication-operator operator --config=/var/run/configmaps/config/operator-config.yaml --v=2 --terminate-on-files=/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt --terminate-on-files=/tmp/terminate Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: ],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE_OAUTH_SERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d3571ade02a7c61123d62c53fda6a57031a52c058c0571759dc09f96b23978f,ValueFrom:nil,},EnvVar{Name:IMAGE_OAUTH_APISERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_OAUTH_SERVER_IMAGE_VERSION,Value:4.18.34_openshift,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{209715200 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/trusted-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/service-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9c92k,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod authentication-operator-7c6989d6c4-8kd6c_openshift-authentication-operator(ecb5bdcc-647d-4292-a33d-dc3df331c206): ErrImagePull: rpc error: code = Canceled desc = copying config: 
context canceled Mar 13 10:36:05.055519 master-0 kubenswrapper[7508]: > logger="UnhandledError" Mar 13 10:36:05.058047 master-0 kubenswrapper[7508]: E0313 10:36:05.057979 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" podUID="ecb5bdcc-647d-4292-a33d-dc3df331c206" Mar 13 10:36:05.514528 master-0 kubenswrapper[7508]: E0313 10:36:05.514431 7508 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3" Mar 13 10:36:05.514853 master-0 kubenswrapper[7508]: E0313 10:36:05.514755 7508 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:csi-snapshot-controller-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3,Command:[],Args:[start -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERAND_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1,ValueFrom:nil,},EnvVar{Name:WEBHOOK_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d5e9989ee0577e930adcd97085176343a881bf92537dda1bf0325a3b1faf96d6,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-wrq5t,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-snapshot-controller-operator-5685fbc7d-pn89z_openshift-cluster-storage-operator(e87ca16c-25de-4fea-b900-2960f4a5f95e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 10:36:05.516150 master-0 kubenswrapper[7508]: E0313 10:36:05.516036 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"csi-snapshot-controller-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" podUID="e87ca16c-25de-4fea-b900-2960f4a5f95e" Mar 13 10:36:05.587261 master-0 kubenswrapper[7508]: I0313 10:36:05.587208 7508 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 10:36:05.941697 master-0 kubenswrapper[7508]: I0313 10:36:05.941615 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:05.942189 master-0 
kubenswrapper[7508]: I0313 10:36:05.941814 7508 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 10:36:05.947168 master-0 kubenswrapper[7508]: I0313 10:36:05.947139 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:36:06.324014 master-0 kubenswrapper[7508]: I0313 10:36:06.323839 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:06.324014 master-0 kubenswrapper[7508]: I0313 10:36:06.323976 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:06.324014 master-0 kubenswrapper[7508]: E0313 10:36:06.324004 7508 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: I0313 10:36:06.324043 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: 
E0313 10:36:06.324083 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324062026 +0000 UTC m=+9.066887133 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: I0313 10:36:06.324122 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: I0313 10:36:06.324162 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324169 7508 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324226 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: 
E0313 10:36:06.324240 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.32422723 +0000 UTC m=+9.067052357 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324236 7508 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324169 7508 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324270 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324255671 +0000 UTC m=+9.067080798 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324287 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:36:10.324279872 +0000 UTC m=+9.067104989 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324311 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324299552 +0000 UTC m=+9.067124679 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: I0313 10:36:06.324306 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: E0313 10:36:06.324358 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 13 10:36:06.324362 master-0 kubenswrapper[7508]: I0313 10:36:06.324374 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324391 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324381644 +0000 UTC m=+9.067206761 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: I0313 10:36:06.324412 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324457 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: I0313 10:36:06.324478 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: 
\"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324488 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324479167 +0000 UTC m=+9.067304284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324545 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: I0313 10:36:06.324559 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324578 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324568589 +0000 UTC m=+9.067393706 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324662 7508 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: I0313 10:36:06.324722 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324736 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324787 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324772224 +0000 UTC m=+9.067597342 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324808 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324800045 +0000 UTC m=+9.067625162 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: I0313 10:36:06.324806 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324822 7508 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324856 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324847116 +0000 UTC m=+9.067672233 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324874 7508 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: I0313 10:36:06.324847 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:06.324911 master-0 kubenswrapper[7508]: E0313 10:36:06.324906 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.324896678 +0000 UTC m=+9.067721805 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:36:06.325803 master-0 kubenswrapper[7508]: E0313 10:36:06.324990 7508 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:06.325803 master-0 kubenswrapper[7508]: E0313 10:36:06.325056 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:10.325044382 +0000 UTC m=+9.067869499 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:06.407779 master-0 kubenswrapper[7508]: E0313 10:36:06.407706 7508 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460" Mar 13 10:36:06.408028 master-0 kubenswrapper[7508]: E0313 10:36:06.407967 7508 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kn5nv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-55t7x_openshift-network-operator(58685de6-b4ae-4229-870b-5143a6010450): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 10:36:06.409217 master-0 kubenswrapper[7508]: E0313 10:36:06.409145 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-network-operator/iptables-alerter-55t7x" podUID="58685de6-b4ae-4229-870b-5143a6010450" Mar 13 10:36:06.920553 master-0 kubenswrapper[7508]: E0313 10:36:06.920496 7508 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9" Mar 13 10:36:06.920898 master-0 kubenswrapper[7508]: E0313 10:36:06.920800 7508 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-storage-version-migrator-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9,Command:[cluster-kube-storage-version-migrator-operator start],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hwfd8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1001,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-storage-version-migrator-operator-7f65c457f5-kxmt9_openshift-kube-storage-version-migrator-operator(ba3e43ba-2840-4612-a370-87ad3c5a382a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 13 10:36:06.922156 master-0 kubenswrapper[7508]: E0313 10:36:06.922046 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-storage-version-migrator-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" podUID="ba3e43ba-2840-4612-a370-87ad3c5a382a" Mar 13 10:36:07.392629 master-0 kubenswrapper[7508]: I0313 10:36:07.390649 7508 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-jwfjl"] Mar 13 10:36:07.458934 master-0 kubenswrapper[7508]: W0313 10:36:07.458887 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7b698d2_f23a_4404_bc63_757ca549356f.slice/crio-eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9 WatchSource:0}: Error finding container eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9: Status 404 returned error can't find the container with id eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9 Mar 13 10:36:07.596248 master-0 kubenswrapper[7508]: I0313 10:36:07.596119 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerStarted","Data":"c24269090669e540d849b1a7ede32ee9641b8d7335ec065d4a9e4c4317788e00"} Mar 13 10:36:07.601175 master-0 kubenswrapper[7508]: I0313 10:36:07.600929 7508 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="57e72688ac44b6f412bc80bc5d4c7d9672ed6ce81db27dd8e0ee399b42f61ca3" exitCode=0 Mar 13 10:36:07.601175 master-0 kubenswrapper[7508]: I0313 10:36:07.601039 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"57e72688ac44b6f412bc80bc5d4c7d9672ed6ce81db27dd8e0ee399b42f61ca3"} Mar 13 10:36:07.604142 master-0 kubenswrapper[7508]: I0313 10:36:07.603059 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" 
event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerStarted","Data":"32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0"} Mar 13 10:36:07.614127 master-0 kubenswrapper[7508]: I0313 10:36:07.610808 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerStarted","Data":"e4267b4b9b6b191ff966b31bd837f533d3228034c0ef80179d1995e5cb7ea50e"} Mar 13 10:36:07.614127 master-0 kubenswrapper[7508]: I0313 10:36:07.612186 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jwfjl" event={"ID":"a7b698d2-f23a-4404-bc63-757ca549356f","Type":"ContainerStarted","Data":"dbde788ea183ad05575d070f12031405131e50e2eb12fce79b8429c063439949"} Mar 13 10:36:07.614127 master-0 kubenswrapper[7508]: I0313 10:36:07.612234 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jwfjl" event={"ID":"a7b698d2-f23a-4404-bc63-757ca549356f","Type":"ContainerStarted","Data":"eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9"} Mar 13 10:36:07.614127 master-0 kubenswrapper[7508]: I0313 10:36:07.612770 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:36:07.618024 master-0 kubenswrapper[7508]: I0313 10:36:07.617109 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerStarted","Data":"9c421d2fac6d7087c86a68ae07bf424407e762fa4149a323b0ac68e925b5c3b2"} Mar 13 10:36:07.619481 master-0 kubenswrapper[7508]: I0313 10:36:07.619365 7508 generic.go:334] "Generic (PLEG): container finished" podID="3f872e59-1de1-4a95-8064-79696c73e8ab" 
containerID="c5e876296b0a2729a3344c97bacebf2dce95059710f134fefa8e83abca942e51" exitCode=0 Mar 13 10:36:07.619481 master-0 kubenswrapper[7508]: I0313 10:36:07.619418 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerDied","Data":"c5e876296b0a2729a3344c97bacebf2dce95059710f134fefa8e83abca942e51"} Mar 13 10:36:08.844125 master-0 kubenswrapper[7508]: I0313 10:36:08.843529 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:36:10.389524 master-0 kubenswrapper[7508]: I0313 10:36:10.389447 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:10.389524 master-0 kubenswrapper[7508]: I0313 10:36:10.389522 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389549 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 
10:36:10.389581 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389609 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389628 7508 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389637 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389628 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389696 7508 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 
10:36:10.389703 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls podName:e7d31378-e940-4473-ab37-10f250c76666 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389681793 +0000 UTC m=+17.132506910 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls") pod "dns-operator-589895fbb7-6zkqh" (UID: "e7d31378-e940-4473-ab37-10f250c76666") : secret "metrics-tls" not found Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389748 7508 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389781 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389755345 +0000 UTC m=+17.132580462 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "performance-addon-operator-webhook-cert" not found Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389799 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls podName:cf740515-d70d-44b6-ac00-21143b5494d1 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389790486 +0000 UTC m=+17.132615603 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls") pod "ingress-operator-677db989d6-b2ss8" (UID: "cf740515-d70d-44b6-ac00-21143b5494d1") : secret "metrics-tls" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389813 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert podName:b04498f0-5a3f-4461-aecb-50304662d854 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389806597 +0000 UTC m=+17.132631714 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert") pod "cluster-version-operator-745944c6b7-wlkwm" (UID: "b04498f0-5a3f-4461-aecb-50304662d854") : secret "cluster-version-operator-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389838 7508 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389857 7508 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389869 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls podName:25332da9-099c-4190-9e24-c19c86830a54 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389858988 +0000 UTC m=+17.132684105 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-cchhs" (UID: "25332da9-099c-4190-9e24-c19c86830a54") : secret "image-registry-operator-tls" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389747 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389887 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389877798 +0000 UTC m=+17.132702915 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389802 7508 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389915 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389908089 +0000 UTC m=+17.132733206 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389912 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389948 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.389969 7508 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.389989 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390002 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.389990271 +0000 UTC m=+17.132815388 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.390029 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390047 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.390055 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390074 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.390066293 +0000 UTC m=+17.132891420 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: I0313 10:36:10.390110 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390135 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390167 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.390157216 +0000 UTC m=+17.132982333 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390187 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390209 7508 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390218 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.390209207 +0000 UTC m=+17.133034334 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390237 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.390228768 +0000 UTC m=+17.133053885 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found
Mar 13 10:36:10.390269 master-0 kubenswrapper[7508]: E0313 10:36:10.390278 7508 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 13 10:36:10.392442 master-0 kubenswrapper[7508]: E0313 10:36:10.390307 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls podName:d9fd7b06-d61d-47c3-a08f-846245c79cc9 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.390299729 +0000 UTC m=+17.133124846 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-2qml7" (UID: "d9fd7b06-d61d-47c3-a08f-846245c79cc9") : secret "node-tuning-operator-tls" not found
Mar 13 10:36:10.745105 master-0 kubenswrapper[7508]: I0313 10:36:10.744949 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: I0313 10:36:10.830894 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-xldln"]
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: E0313 10:36:10.831129 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller"
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: I0313 10:36:10.831146 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller"
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: E0313 10:36:10.831158 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerName="prober"
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: I0313 10:36:10.831167 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerName="prober"
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: I0313 10:36:10.831293 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="6c9c8030-b756-4ec4-b585-19672dc61df1" containerName="prober"
Mar 13 10:36:10.831247 master-0 kubenswrapper[7508]: I0313 10:36:10.831309 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller"
Mar 13 10:36:10.837521 master-0 kubenswrapper[7508]: I0313 10:36:10.831663 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:10.837521 master-0 kubenswrapper[7508]: I0313 10:36:10.833176 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 10:36:10.837521 master-0 kubenswrapper[7508]: I0313 10:36:10.833711 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 10:36:10.837521 master-0 kubenswrapper[7508]: I0313 10:36:10.833974 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 10:36:10.837521 master-0 kubenswrapper[7508]: I0313 10:36:10.836044 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 10:36:10.843973 master-0 kubenswrapper[7508]: I0313 10:36:10.843928 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-xldln"]
Mar 13 10:36:10.999511 master-0 kubenswrapper[7508]: I0313 10:36:10.999291 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htb49\" (UniqueName: \"kubernetes.io/projected/2c3e94d4-5c6d-4092-975c-e5bca49eb397-kube-api-access-htb49\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:10.999707 master-0 kubenswrapper[7508]: I0313 10:36:10.999580 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-key\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:10.999707 master-0 kubenswrapper[7508]: I0313 10:36:10.999618 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-cabundle\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.100485 master-0 kubenswrapper[7508]: I0313 10:36:11.100378 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htb49\" (UniqueName: \"kubernetes.io/projected/2c3e94d4-5c6d-4092-975c-e5bca49eb397-kube-api-access-htb49\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.100754 master-0 kubenswrapper[7508]: I0313 10:36:11.100641 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-key\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.100754 master-0 kubenswrapper[7508]: I0313 10:36:11.100685 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-cabundle\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.102740 master-0 kubenswrapper[7508]: I0313 10:36:11.102687 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-cabundle\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.114024 master-0 kubenswrapper[7508]: I0313 10:36:11.113966 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-key\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.304309 master-0 kubenswrapper[7508]: I0313 10:36:11.304212 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htb49\" (UniqueName: \"kubernetes.io/projected/2c3e94d4-5c6d-4092-975c-e5bca49eb397-kube-api-access-htb49\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.464610 master-0 kubenswrapper[7508]: I0313 10:36:11.464519 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln"
Mar 13 10:36:11.600203 master-0 kubenswrapper[7508]: I0313 10:36:11.600047 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:11.608968 master-0 kubenswrapper[7508]: I0313 10:36:11.608922 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:11.887839 master-0 kubenswrapper[7508]: I0313 10:36:11.887752 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:11.997425 master-0 kubenswrapper[7508]: I0313 10:36:11.997364 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:12.003222 master-0 kubenswrapper[7508]: I0313 10:36:12.003189 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:12.415149 master-0 kubenswrapper[7508]: I0313 10:36:12.414677 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-xldln"]
Mar 13 10:36:12.482273 master-0 kubenswrapper[7508]: W0313 10:36:12.482068 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c3e94d4_5c6d_4092_975c_e5bca49eb397.slice/crio-b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f WatchSource:0}: Error finding container b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f: Status 404 returned error can't find the container with id b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f
Mar 13 10:36:12.886122 master-0 kubenswrapper[7508]: I0313 10:36:12.886051 7508 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="136407fc6ee546951641a1123b4e37b22c08b30eef90bafae91497fd8eca613e" exitCode=0
Mar 13 10:36:12.886360 master-0 kubenswrapper[7508]: I0313 10:36:12.886137 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"136407fc6ee546951641a1123b4e37b22c08b30eef90bafae91497fd8eca613e"}
Mar 13 10:36:12.887612 master-0 kubenswrapper[7508]: I0313 10:36:12.887566 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" event={"ID":"2c3e94d4-5c6d-4092-975c-e5bca49eb397","Type":"ContainerStarted","Data":"6e7919b9ec2a19d38d0cdba955ac4202dc210129fdef5e7c637e62cb54c916e6"}
Mar 13 10:36:12.887708 master-0 kubenswrapper[7508]: I0313 10:36:12.887618 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" event={"ID":"2c3e94d4-5c6d-4092-975c-e5bca49eb397","Type":"ContainerStarted","Data":"b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f"}
Mar 13 10:36:12.889802 master-0 kubenswrapper[7508]: I0313 10:36:12.889755 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3"}
Mar 13 10:36:12.890063 master-0 kubenswrapper[7508]: I0313 10:36:12.890028 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:36:12.893442 master-0 kubenswrapper[7508]: I0313 10:36:12.893391 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:36:12.981960 master-0 kubenswrapper[7508]: I0313 10:36:12.981863 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" podStartSLOduration=2.98182492 podStartE2EDuration="2.98182492s" podCreationTimestamp="2026-03-13 10:36:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:12.980725831 +0000 UTC m=+11.723550958" watchObservedRunningTime="2026-03-13 10:36:12.98182492 +0000 UTC m=+11.724650047"
Mar 13 10:36:15.458899 master-0 kubenswrapper[7508]: I0313 10:36:15.458846 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:36:16.937938 master-0 kubenswrapper[7508]: I0313 10:36:16.937473 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" event={"ID":"1f358d81-87c6-40bf-89e8-5681429285f8","Type":"ContainerStarted","Data":"b5048988f4d14da58f4ecce60f1b0f53c921c94b9f30bb0d6da211a5c6a3196b"}
Mar 13 10:36:16.938804 master-0 kubenswrapper[7508]: I0313 10:36:16.938714 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerStarted","Data":"fba0ad5a7ea5359314eabe4a73e6d377274ab61d90c33e03f2dabdbba3678155"}
Mar 13 10:36:16.940462 master-0 kubenswrapper[7508]: I0313 10:36:16.940427 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerStarted","Data":"c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660"}
Mar 13 10:36:17.792389 master-0 kubenswrapper[7508]: I0313 10:36:17.792341 7508 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": dial tcp 10.128.0.15:8443: connect: connection refused" start-of-body=
Mar 13 10:36:17.792489 master-0 kubenswrapper[7508]: I0313 10:36:17.792416 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": dial tcp 10.128.0.15:8443: connect: connection refused"
Mar 13 10:36:17.947565 master-0 kubenswrapper[7508]: I0313 10:36:17.947494 7508 generic.go:334] "Generic (PLEG): container finished" podID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerID="5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3" exitCode=0
Mar 13 10:36:17.948193 master-0 kubenswrapper[7508]: I0313 10:36:17.947590 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerDied","Data":"5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3"}
Mar 13 10:36:17.948272 master-0 kubenswrapper[7508]: I0313 10:36:17.948200 7508 scope.go:117] "RemoveContainer" containerID="5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3"
Mar 13 10:36:17.949963 master-0 kubenswrapper[7508]: I0313 10:36:17.949568 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-t2xfz_0932314b-ccf5-4be5-99f8-b99886392daa/etcd-operator/0.log"
Mar 13 10:36:17.949963 master-0 kubenswrapper[7508]: I0313 10:36:17.949797 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerDied","Data":"fba0ad5a7ea5359314eabe4a73e6d377274ab61d90c33e03f2dabdbba3678155"}
Mar 13 10:36:17.950139 master-0 kubenswrapper[7508]: I0313 10:36:17.950109 7508 scope.go:117] "RemoveContainer" containerID="fba0ad5a7ea5359314eabe4a73e6d377274ab61d90c33e03f2dabdbba3678155"
Mar 13 10:36:17.950725 master-0 kubenswrapper[7508]: I0313 10:36:17.950632 7508 generic.go:334] "Generic (PLEG): container finished" podID="0932314b-ccf5-4be5-99f8-b99886392daa" containerID="fba0ad5a7ea5359314eabe4a73e6d377274ab61d90c33e03f2dabdbba3678155" exitCode=255
Mar 13 10:36:17.953450 master-0 kubenswrapper[7508]: I0313 10:36:17.953415 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-xvxcr_f8c7f667-d30e-41f4-8c0e-f3f138bffab4/cluster-olm-operator/0.log"
Mar 13 10:36:17.954114 master-0 kubenswrapper[7508]: I0313 10:36:17.954063 7508 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660" exitCode=255
Mar 13 10:36:17.954114 master-0 kubenswrapper[7508]: I0313 10:36:17.954089 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660"}
Mar 13 10:36:17.954432 master-0 kubenswrapper[7508]: I0313 10:36:17.954321 7508 scope.go:117] "RemoveContainer" containerID="c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660"
Mar 13 10:36:18.110088 master-0 kubenswrapper[7508]: I0313 10:36:18.109691 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"]
Mar 13 10:36:18.110949 master-0 kubenswrapper[7508]: I0313 10:36:18.110917 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.121605 master-0 kubenswrapper[7508]: I0313 10:36:18.121554 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 10:36:18.121605 master-0 kubenswrapper[7508]: I0313 10:36:18.121597 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 10:36:18.121892 master-0 kubenswrapper[7508]: I0313 10:36:18.121554 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:36:18.121892 master-0 kubenswrapper[7508]: I0313 10:36:18.121770 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 10:36:18.121892 master-0 kubenswrapper[7508]: I0313 10:36:18.121887 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 10:36:18.124135 master-0 kubenswrapper[7508]: I0313 10:36:18.122109 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 10:36:18.130874 master-0 kubenswrapper[7508]: I0313 10:36:18.130828 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"]
Mar 13 10:36:18.186147 master-0 kubenswrapper[7508]: I0313 10:36:18.186081 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.186528 master-0 kubenswrapper[7508]: I0313 10:36:18.186505 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.186801 master-0 kubenswrapper[7508]: I0313 10:36:18.186777 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.187218 master-0 kubenswrapper[7508]: I0313 10:36:18.187182 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fq4nx\" (UniqueName: \"kubernetes.io/projected/60f91926-4b7d-4cb5-bd55-4600ff560156-kube-api-access-fq4nx\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.187368 master-0 kubenswrapper[7508]: I0313 10:36:18.187350 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.288661 master-0 kubenswrapper[7508]: I0313 10:36:18.288482 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fq4nx\" (UniqueName: \"kubernetes.io/projected/60f91926-4b7d-4cb5-bd55-4600ff560156-kube-api-access-fq4nx\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.288661 master-0 kubenswrapper[7508]: I0313 10:36:18.288547 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.288661 master-0 kubenswrapper[7508]: I0313 10:36:18.288617 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.288933 master-0 kubenswrapper[7508]: I0313 10:36:18.288683 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.288933 master-0 kubenswrapper[7508]: I0313 10:36:18.288735 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.289010 master-0 kubenswrapper[7508]: E0313 10:36:18.288962 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:18.289072 master-0 kubenswrapper[7508]: E0313 10:36:18.289031 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.789011559 +0000 UTC m=+17.531836676 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "client-ca" not found
Mar 13 10:36:18.289072 master-0 kubenswrapper[7508]: E0313 10:36:18.289069 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 13 10:36:18.289187 master-0 kubenswrapper[7508]: E0313 10:36:18.289087 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.789081661 +0000 UTC m=+17.531906768 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "openshift-global-ca" not found
Mar 13 10:36:18.289249 master-0 kubenswrapper[7508]: E0313 10:36:18.289191 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:18.289294 master-0 kubenswrapper[7508]: E0313 10:36:18.289261 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.789239625 +0000 UTC m=+17.532064752 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : secret "serving-cert" not found
Mar 13 10:36:18.289294 master-0 kubenswrapper[7508]: E0313 10:36:18.289267 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 13 10:36:18.289294 master-0 kubenswrapper[7508]: E0313 10:36:18.289294 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:18.789286846 +0000 UTC m=+17.532111963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "config" not found
Mar 13 10:36:18.321123 master-0 kubenswrapper[7508]: I0313 10:36:18.319497 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fq4nx\" (UniqueName: \"kubernetes.io/projected/60f91926-4b7d-4cb5-bd55-4600ff560156-kube-api-access-fq4nx\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:18.389915 master-0 kubenswrapper[7508]: I0313 10:36:18.389847 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 
13 10:36:18.389915 master-0 kubenswrapper[7508]: I0313 10:36:18.389903 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:18.389915 master-0 kubenswrapper[7508]: I0313 10:36:18.389923 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:18.390280 master-0 kubenswrapper[7508]: I0313 10:36:18.389945 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:36:18.390280 master-0 kubenswrapper[7508]: I0313 10:36:18.389973 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:18.390280 master-0 kubenswrapper[7508]: I0313 10:36:18.390004 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:18.390280 master-0 kubenswrapper[7508]: I0313 10:36:18.390023 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:36:18.390280 master-0 kubenswrapper[7508]: I0313 10:36:18.390053 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:36:18.390475 master-0 kubenswrapper[7508]: E0313 10:36:18.390420 7508 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:18.390521 master-0 kubenswrapper[7508]: E0313 10:36:18.390512 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls podName:17b956d3-c046-4f26-8be2-718c165a3acc nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.390492086 +0000 UTC m=+33.133317203 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-7rcdn" (UID: "17b956d3-c046-4f26-8be2-718c165a3acc") : secret "cluster-monitoring-operator-tls" not found Mar 13 10:36:18.394146 master-0 kubenswrapper[7508]: E0313 10:36:18.390810 7508 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 13 10:36:18.394146 master-0 kubenswrapper[7508]: E0313 10:36:18.390890 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.390869946 +0000 UTC m=+33.133695153 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : secret "multus-admission-controller-secret" not found Mar 13 10:36:18.394146 master-0 kubenswrapper[7508]: E0313 10:36:18.390941 7508 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 13 10:36:18.394146 master-0 kubenswrapper[7508]: E0313 10:36:18.390982 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics podName:1ef32245-c238-43c6-a57a-a5ac95aff1f7 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.390972339 +0000 UTC m=+33.133797446 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-4v99n" (UID: "1ef32245-c238-43c6-a57a-a5ac95aff1f7") : secret "marketplace-operator-metrics" not found Mar 13 10:36:18.401129 master-0 kubenswrapper[7508]: I0313 10:36:18.397668 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:18.401129 master-0 kubenswrapper[7508]: I0313 10:36:18.397732 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:18.401129 master-0 kubenswrapper[7508]: I0313 10:36:18.397745 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"cluster-version-operator-745944c6b7-wlkwm\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:18.401129 master-0 kubenswrapper[7508]: I0313 10:36:18.397731 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: 
\"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:18.401129 master-0 kubenswrapper[7508]: I0313 10:36:18.398248 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:18.452643 master-0 kubenswrapper[7508]: I0313 10:36:18.452578 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:36:18.491528 master-0 kubenswrapper[7508]: I0313 10:36:18.491470 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:18.491528 master-0 kubenswrapper[7508]: I0313 10:36:18.491533 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:36:18.491914 master-0 kubenswrapper[7508]: E0313 10:36:18.491683 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 13 10:36:18.491914 master-0 kubenswrapper[7508]: E0313 10:36:18.491783 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert 
podName:024d9bd3-ac77-4257-9808-7518f2a73e11 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.491760857 +0000 UTC m=+33.234586034 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert") pod "olm-operator-d64cfc9db-h46sf" (UID: "024d9bd3-ac77-4257-9808-7518f2a73e11") : secret "olm-operator-serving-cert" not found Mar 13 10:36:18.491914 master-0 kubenswrapper[7508]: I0313 10:36:18.491707 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:36:18.491914 master-0 kubenswrapper[7508]: I0313 10:36:18.491836 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:18.491914 master-0 kubenswrapper[7508]: E0313 10:36:18.491865 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 13 10:36:18.491914 master-0 kubenswrapper[7508]: E0313 10:36:18.491915 7508 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 13 10:36:18.492290 master-0 kubenswrapper[7508]: E0313 10:36:18.491911 7508 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret 
"metrics-daemon-secret" not found Mar 13 10:36:18.492290 master-0 kubenswrapper[7508]: I0313 10:36:18.491866 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:36:18.492290 master-0 kubenswrapper[7508]: E0313 10:36:18.491940 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert podName:a13f3e08-2b67-404f-8695-77aa17f92137 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.491932682 +0000 UTC m=+33.234757799 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-cfp26" (UID: "a13f3e08-2b67-404f-8695-77aa17f92137") : secret "package-server-manager-serving-cert" not found Mar 13 10:36:18.492290 master-0 kubenswrapper[7508]: E0313 10:36:18.492022 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs podName:8df2728b-4f21-4aef-b31f-4197bbcd2728 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.491998103 +0000 UTC m=+33.234823220 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs") pod "network-metrics-daemon-c5vhc" (UID: "8df2728b-4f21-4aef-b31f-4197bbcd2728") : secret "metrics-daemon-secret" not found Mar 13 10:36:18.492674 master-0 kubenswrapper[7508]: E0313 10:36:18.492639 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert podName:03b97fde-467c-46f0-95f9-9c3820b4d790 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:34.49262037 +0000 UTC m=+33.235445497 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert") pod "catalog-operator-7d9c49f57b-tw9nm" (UID: "03b97fde-467c-46f0-95f9-9c3820b4d790") : secret "catalog-operator-serving-cert" not found Mar 13 10:36:18.495416 master-0 kubenswrapper[7508]: I0313 10:36:18.495380 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:18.653034 master-0 kubenswrapper[7508]: I0313 10:36:18.652943 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:36:18.655846 master-0 kubenswrapper[7508]: I0313 10:36:18.655807 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:36:18.655929 master-0 kubenswrapper[7508]: I0313 10:36:18.655871 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" Mar 13 10:36:18.661079 master-0 kubenswrapper[7508]: I0313 10:36:18.661038 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:36:18.661545 master-0 kubenswrapper[7508]: I0313 10:36:18.661506 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: I0313 10:36:18.799511 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: I0313 10:36:18.799775 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: I0313 10:36:18.799830 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: I0313 10:36:18.799867 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800055 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800134 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:19.800112949 +0000 UTC m=+18.542938066 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : secret "serving-cert" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800534 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800568 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:19.80055847 +0000 UTC m=+18.543383587 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "openshift-global-ca" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800599 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800621 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:19.800613492 +0000 UTC m=+18.543438609 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "client-ca" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800649 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found Mar 13 10:36:18.805847 master-0 kubenswrapper[7508]: E0313 10:36:18.800668 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:19.800662293 +0000 UTC m=+18.543487410 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "config" not found Mar 13 10:36:18.931976 master-0 kubenswrapper[7508]: I0313 10:36:18.931335 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"] Mar 13 10:36:19.182945 master-0 kubenswrapper[7508]: I0313 10:36:19.182831 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" event={"ID":"ecb5bdcc-647d-4292-a33d-dc3df331c206","Type":"ContainerStarted","Data":"cc39dd97fa33d7186bc0c795b8d5e196c978cac3bdc2c8d9dbf7380009448266"} Mar 13 10:36:19.184984 master-0 kubenswrapper[7508]: I0313 10:36:19.184950 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"80e219d86f62937cb95412f8c97959374104036b5299a214ae589c72f2965a63"} Mar 13 10:36:19.185433 master-0 kubenswrapper[7508]: I0313 10:36:19.185410 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:36:19.187072 master-0 kubenswrapper[7508]: I0313 10:36:19.187042 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-t2xfz_0932314b-ccf5-4be5-99f8-b99886392daa/etcd-operator/0.log" Mar 13 10:36:19.187208 master-0 kubenswrapper[7508]: I0313 10:36:19.187112 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerStarted","Data":"b633052bfd920e96b180e39e901d4b8b219bb35a62da570c5f41752fe4e617fe"} Mar 13 
10:36:19.188636 master-0 kubenswrapper[7508]: I0313 10:36:19.188596 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-xvxcr_f8c7f667-d30e-41f4-8c0e-f3f138bffab4/cluster-olm-operator/0.log" Mar 13 10:36:19.189252 master-0 kubenswrapper[7508]: I0313 10:36:19.189221 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerStarted","Data":"2011f2a930c1149a0110b2744b7cf0ecd80491982b05c3fd36024d0672252582"} Mar 13 10:36:19.190649 master-0 kubenswrapper[7508]: I0313 10:36:19.190618 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" event={"ID":"b04498f0-5a3f-4461-aecb-50304662d854","Type":"ContainerStarted","Data":"ca676bc5b7f3f8bc75644ceb62fe29437e3a8b2aa60b785e14180ce2eda8836e"} Mar 13 10:36:19.265118 master-0 kubenswrapper[7508]: I0313 10:36:19.264311 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"] Mar 13 10:36:19.270259 master-0 kubenswrapper[7508]: I0313 10:36:19.269612 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-6zkqh"] Mar 13 10:36:19.272485 master-0 kubenswrapper[7508]: I0313 10:36:19.272180 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"] Mar 13 10:36:19.466328 master-0 kubenswrapper[7508]: I0313 10:36:19.466275 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"] Mar 13 10:36:19.466967 master-0 kubenswrapper[7508]: I0313 10:36:19.466935 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.467330 master-0 kubenswrapper[7508]: I0313 10:36:19.467309 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"] Mar 13 10:36:19.467601 master-0 kubenswrapper[7508]: E0313 10:36:19.467575 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" podUID="60f91926-4b7d-4cb5-bd55-4600ff560156" Mar 13 10:36:19.468889 master-0 kubenswrapper[7508]: I0313 10:36:19.468848 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 10:36:19.472592 master-0 kubenswrapper[7508]: I0313 10:36:19.472567 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 10:36:19.472685 master-0 kubenswrapper[7508]: I0313 10:36:19.472611 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 10:36:19.472878 master-0 kubenswrapper[7508]: I0313 10:36:19.472854 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 10:36:19.473750 master-0 kubenswrapper[7508]: I0313 10:36:19.473723 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 10:36:19.482383 master-0 kubenswrapper[7508]: I0313 10:36:19.482346 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"] Mar 13 10:36:19.784225 master-0 kubenswrapper[7508]: I0313 10:36:19.779441 7508 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.784225 master-0 kubenswrapper[7508]: I0313 10:36:19.779865 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.784225 master-0 kubenswrapper[7508]: I0313 10:36:19.779948 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-config\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.784225 master-0 kubenswrapper[7508]: I0313 10:36:19.779993 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mq9dc\" (UniqueName: \"kubernetes.io/projected/f640710b-2b20-438d-b847-18b8fcb77b4c-kube-api-access-mq9dc\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.881361 master-0 kubenswrapper[7508]: I0313 10:36:19.881300 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:19.881581 master-0 kubenswrapper[7508]: I0313 10:36:19.881414 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.881581 master-0 kubenswrapper[7508]: I0313 10:36:19.881461 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:19.881581 master-0 kubenswrapper[7508]: I0313 10:36:19.881483 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-config\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.881581 master-0 kubenswrapper[7508]: I0313 10:36:19.881513 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 
10:36:19.881581 master-0 kubenswrapper[7508]: I0313 10:36:19.881534 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mq9dc\" (UniqueName: \"kubernetes.io/projected/f640710b-2b20-438d-b847-18b8fcb77b4c-kube-api-access-mq9dc\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.881581 master-0 kubenswrapper[7508]: I0313 10:36:19.881577 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:19.881857 master-0 kubenswrapper[7508]: I0313 10:36:19.881598 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.882780 master-0 kubenswrapper[7508]: E0313 10:36:19.882253 7508 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 10:36:19.882890 master-0 kubenswrapper[7508]: E0313 10:36:19.882805 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:20.382779736 +0000 UTC m=+19.125604853 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : secret "serving-cert" not found Mar 13 10:36:19.883250 master-0 kubenswrapper[7508]: E0313 10:36:19.882354 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 13 10:36:19.883365 master-0 kubenswrapper[7508]: E0313 10:36:19.883313 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.883281189 +0000 UTC m=+20.626106336 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : configmap "client-ca" not found Mar 13 10:36:19.883517 master-0 kubenswrapper[7508]: E0313 10:36:19.883463 7508 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 10:36:19.883517 master-0 kubenswrapper[7508]: E0313 10:36:19.883501 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:20.383490115 +0000 UTC m=+19.126315232 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : configmap "client-ca" not found Mar 13 10:36:19.883517 master-0 kubenswrapper[7508]: E0313 10:36:19.882445 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 10:36:19.883792 master-0 kubenswrapper[7508]: E0313 10:36:19.883529 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert podName:60f91926-4b7d-4cb5-bd55-4600ff560156 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.883522256 +0000 UTC m=+20.626347363 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert") pod "controller-manager-6f7fd6c796-7slkr" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156") : secret "serving-cert" not found Mar 13 10:36:19.884019 master-0 kubenswrapper[7508]: I0313 10:36:19.883999 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:19.884170 master-0 kubenswrapper[7508]: I0313 10:36:19.884152 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-config\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:19.884536 master-0 
kubenswrapper[7508]: I0313 10:36:19.884476 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-7slkr\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:19.919794 master-0 kubenswrapper[7508]: I0313 10:36:19.919699 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mq9dc\" (UniqueName: \"kubernetes.io/projected/f640710b-2b20-438d-b847-18b8fcb77b4c-kube-api-access-mq9dc\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:20.197683 master-0 kubenswrapper[7508]: I0313 10:36:20.197599 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" event={"ID":"d9fd7b06-d61d-47c3-a08f-846245c79cc9","Type":"ContainerStarted","Data":"c918fb3b270e41c6d62b6e571b5882afaab66a46ce66ce229de4e70f9853f259"} Mar 13 10:36:20.199559 master-0 kubenswrapper[7508]: I0313 10:36:20.199505 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" event={"ID":"25332da9-099c-4190-9e24-c19c86830a54","Type":"ContainerStarted","Data":"136e725a814882d97a92b91f392b5a4bb1498352a85819c564006fc0555c46b2"} Mar 13 10:36:20.202696 master-0 kubenswrapper[7508]: I0313 10:36:20.202606 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"bcfacb71ae88d504692e95ad77d6c9b51c2d2697daec2bf687474302cc5abf90"} Mar 13 10:36:20.210955 master-0 kubenswrapper[7508]: I0313 
10:36:20.210144 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" event={"ID":"e87ca16c-25de-4fea-b900-2960f4a5f95e","Type":"ContainerStarted","Data":"02539d7838ebb483ffcca293d983b439f593e30b5eaf03def36de01bbe1607e5"} Mar 13 10:36:20.212635 master-0 kubenswrapper[7508]: I0313 10:36:20.211998 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:20.212635 master-0 kubenswrapper[7508]: I0313 10:36:20.212207 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" event={"ID":"e7d31378-e940-4473-ab37-10f250c76666","Type":"ContainerStarted","Data":"e34fa9d84124b6c127298dbbcc66ee1981c2d493a18d9fee5da615255d116cb0"} Mar 13 10:36:20.224181 master-0 kubenswrapper[7508]: I0313 10:36:20.224140 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr" Mar 13 10:36:20.392386 master-0 kubenswrapper[7508]: I0313 10:36:20.392297 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") pod \"60f91926-4b7d-4cb5-bd55-4600ff560156\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " Mar 13 10:36:20.392386 master-0 kubenswrapper[7508]: I0313 10:36:20.392347 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") pod \"60f91926-4b7d-4cb5-bd55-4600ff560156\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " Mar 13 10:36:20.392727 master-0 kubenswrapper[7508]: I0313 10:36:20.392416 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fq4nx\" (UniqueName: \"kubernetes.io/projected/60f91926-4b7d-4cb5-bd55-4600ff560156-kube-api-access-fq4nx\") pod \"60f91926-4b7d-4cb5-bd55-4600ff560156\" (UID: \"60f91926-4b7d-4cb5-bd55-4600ff560156\") " Mar 13 10:36:20.392727 master-0 kubenswrapper[7508]: I0313 10:36:20.392522 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:20.392727 master-0 kubenswrapper[7508]: I0313 10:36:20.392611 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " 
pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:20.392727 master-0 kubenswrapper[7508]: E0313 10:36:20.392706 7508 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 13 10:36:20.392987 master-0 kubenswrapper[7508]: E0313 10:36:20.392753 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.392738977 +0000 UTC m=+20.135564094 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : secret "serving-cert" not found Mar 13 10:36:20.392987 master-0 kubenswrapper[7508]: I0313 10:36:20.392843 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config" (OuterVolumeSpecName: "config") pod "60f91926-4b7d-4cb5-bd55-4600ff560156" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:20.392987 master-0 kubenswrapper[7508]: E0313 10:36:20.392918 7508 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 13 10:36:20.392987 master-0 kubenswrapper[7508]: E0313 10:36:20.392952 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.392940232 +0000 UTC m=+20.135765359 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : configmap "client-ca" not found Mar 13 10:36:20.395081 master-0 kubenswrapper[7508]: I0313 10:36:20.393299 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "60f91926-4b7d-4cb5-bd55-4600ff560156" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:20.399116 master-0 kubenswrapper[7508]: I0313 10:36:20.397038 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/60f91926-4b7d-4cb5-bd55-4600ff560156-kube-api-access-fq4nx" (OuterVolumeSpecName: "kube-api-access-fq4nx") pod "60f91926-4b7d-4cb5-bd55-4600ff560156" (UID: "60f91926-4b7d-4cb5-bd55-4600ff560156"). InnerVolumeSpecName "kube-api-access-fq4nx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:36:20.493346 master-0 kubenswrapper[7508]: I0313 10:36:20.493311 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:20.493346 master-0 kubenswrapper[7508]: I0313 10:36:20.493342 7508 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:20.493346 master-0 kubenswrapper[7508]: I0313 10:36:20.493352 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fq4nx\" (UniqueName: \"kubernetes.io/projected/60f91926-4b7d-4cb5-bd55-4600ff560156-kube-api-access-fq4nx\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:20.706971 master-0 kubenswrapper[7508]: I0313 10:36:20.706418 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k"] Mar 13 10:36:20.707254 master-0 kubenswrapper[7508]: I0313 10:36:20.707086 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" Mar 13 10:36:20.716891 master-0 kubenswrapper[7508]: I0313 10:36:20.716839 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k"] Mar 13 10:36:20.809636 master-0 kubenswrapper[7508]: I0313 10:36:20.809522 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5v4b\" (UniqueName: \"kubernetes.io/projected/84f78350-e85c-4377-97cd-9e9a1b2ff4ee-kube-api-access-d5v4b\") pod \"csi-snapshot-controller-7577d6f48-kcw4k\" (UID: \"84f78350-e85c-4377-97cd-9e9a1b2ff4ee\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" Mar 13 10:36:20.864181 master-0 kubenswrapper[7508]: I0313 10:36:20.859232 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-kxkw5"] Mar 13 10:36:20.864181 master-0 kubenswrapper[7508]: I0313 10:36:20.860251 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:20.864181 master-0 kubenswrapper[7508]: I0313 10:36:20.862213 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 10:36:20.864181 master-0 kubenswrapper[7508]: I0313 10:36:20.862749 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0" Mar 13 10:36:20.865200 master-0 kubenswrapper[7508]: I0313 10:36:20.865176 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0" Mar 13 10:36:20.865707 master-0 kubenswrapper[7508]: I0313 10:36:20.865681 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 10:36:20.866410 master-0 kubenswrapper[7508]: I0313 10:36:20.865772 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 10:36:20.866410 master-0 kubenswrapper[7508]: I0313 10:36:20.865693 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 10:36:20.866707 master-0 kubenswrapper[7508]: I0313 10:36:20.866636 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 10:36:20.867266 master-0 kubenswrapper[7508]: I0313 10:36:20.867214 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 10:36:20.867413 master-0 kubenswrapper[7508]: I0313 10:36:20.867380 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 10:36:20.873328 master-0 kubenswrapper[7508]: I0313 10:36:20.873219 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-kxkw5"] Mar 13 10:36:20.876029 master-0 kubenswrapper[7508]: I0313 10:36:20.875822 7508 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 10:36:20.913604 master-0 kubenswrapper[7508]: I0313 10:36:20.913528 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5v4b\" (UniqueName: \"kubernetes.io/projected/84f78350-e85c-4377-97cd-9e9a1b2ff4ee-kube-api-access-d5v4b\") pod \"csi-snapshot-controller-7577d6f48-kcw4k\" (UID: \"84f78350-e85c-4377-97cd-9e9a1b2ff4ee\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" Mar 13 10:36:20.943858 master-0 kubenswrapper[7508]: I0313 10:36:20.943754 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5v4b\" (UniqueName: \"kubernetes.io/projected/84f78350-e85c-4377-97cd-9e9a1b2ff4ee-kube-api-access-d5v4b\") pod \"csi-snapshot-controller-7577d6f48-kcw4k\" (UID: \"84f78350-e85c-4377-97cd-9e9a1b2ff4ee\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" Mar 13 10:36:21.014666 master-0 kubenswrapper[7508]: I0313 10:36:21.014601 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.014794 master-0 kubenswrapper[7508]: I0313 10:36:21.014691 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-node-pullsecrets\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.014794 master-0 kubenswrapper[7508]: I0313 10:36:21.014719 7508 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-config\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.014932 master-0 kubenswrapper[7508]: I0313 10:36:21.014850 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-trusted-ca-bundle\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.014932 master-0 kubenswrapper[7508]: I0313 10:36:21.014894 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.015078 master-0 kubenswrapper[7508]: I0313 10:36:21.014963 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-audit-dir\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.015078 master-0 kubenswrapper[7508]: I0313 10:36:21.015020 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.015296 
master-0 kubenswrapper[7508]: I0313 10:36:21.015130 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-encryption-config\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.015296 master-0 kubenswrapper[7508]: I0313 10:36:21.015278 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-image-import-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.015407 master-0 kubenswrapper[7508]: I0313 10:36:21.015374 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.015817 master-0 kubenswrapper[7508]: I0313 10:36:21.015446 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvcmz\" (UniqueName: \"kubernetes.io/projected/f20123c9-6e65-4b74-919f-8b399022e0f5-kube-api-access-mvcmz\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.037329 master-0 kubenswrapper[7508]: I0313 10:36:21.037285 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" Mar 13 10:36:21.116695 master-0 kubenswrapper[7508]: I0313 10:36:21.116350 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-image-import-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.116935 master-0 kubenswrapper[7508]: I0313 10:36:21.116792 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.116935 master-0 kubenswrapper[7508]: I0313 10:36:21.116860 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvcmz\" (UniqueName: \"kubernetes.io/projected/f20123c9-6e65-4b74-919f-8b399022e0f5-kube-api-access-mvcmz\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117023 master-0 kubenswrapper[7508]: I0313 10:36:21.116983 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117023 master-0 kubenswrapper[7508]: I0313 10:36:21.117014 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-node-pullsecrets\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117114 master-0 kubenswrapper[7508]: I0313 10:36:21.117041 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-config\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117256 master-0 kubenswrapper[7508]: I0313 10:36:21.117231 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-trusted-ca-bundle\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117325 master-0 kubenswrapper[7508]: I0313 10:36:21.117295 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117358 master-0 kubenswrapper[7508]: I0313 10:36:21.117348 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-audit-dir\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117388 master-0 kubenswrapper[7508]: I0313 10:36:21.117364 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.117388 master-0 kubenswrapper[7508]: I0313 10:36:21.117382 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-encryption-config\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.118379 master-0 kubenswrapper[7508]: I0313 10:36:21.118289 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-node-pullsecrets\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.118718 7508 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.118778 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.618759523 +0000 UTC m=+20.361584640 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "serving-cert" not found Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.119242 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: I0313 10:36:21.119270 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-audit-dir\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.119250 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.119322 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.619289767 +0000 UTC m=+20.362114994 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "etcd-serving-ca" not found
Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.119325 7508 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.119351 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.619338298 +0000 UTC m=+20.362163415 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "audit-0" not found
Mar 13 10:36:21.119776 master-0 kubenswrapper[7508]: E0313 10:36:21.119394 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:21.619375739 +0000 UTC m=+20.362200856 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "etcd-client" not found
Mar 13 10:36:21.120256 master-0 kubenswrapper[7508]: I0313 10:36:21.119872 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-image-import-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.122840 master-0 kubenswrapper[7508]: I0313 10:36:21.122807 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-config\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.123624 master-0 kubenswrapper[7508]: I0313 10:36:21.123555 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-encryption-config\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.123860 master-0 kubenswrapper[7508]: I0313 10:36:21.123834 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-trusted-ca-bundle\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.149295 master-0 kubenswrapper[7508]: I0313 10:36:21.145679 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvcmz\" (UniqueName: \"kubernetes.io/projected/f20123c9-6e65-4b74-919f-8b399022e0f5-kube-api-access-mvcmz\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.233661 master-0 kubenswrapper[7508]: I0313 10:36:21.229418 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"
Mar 13 10:36:21.233661 master-0 kubenswrapper[7508]: I0313 10:36:21.229870 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" event={"ID":"ba3e43ba-2840-4612-a370-87ad3c5a382a","Type":"ContainerStarted","Data":"d028fc794a246b2460076d0dced5db6f65d2c7474177aae275ffc67970fe251d"}
Mar 13 10:36:21.246798 master-0 kubenswrapper[7508]: I0313 10:36:21.246217 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:36:21.306907 master-0 kubenswrapper[7508]: I0313 10:36:21.291455 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k"]
Mar 13 10:36:21.306907 master-0 kubenswrapper[7508]: I0313 10:36:21.301705 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"]
Mar 13 10:36:21.306907 master-0 kubenswrapper[7508]: I0313 10:36:21.301751 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-7slkr"]
Mar 13 10:36:21.306907 master-0 kubenswrapper[7508]: W0313 10:36:21.302476 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod84f78350_e85c_4377_97cd_9e9a1b2ff4ee.slice/crio-3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c WatchSource:0}: Error finding container 3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c: Status 404 returned error can't find the container with id 3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c
Mar 13 10:36:21.423256 master-0 kubenswrapper[7508]: E0313 10:36:21.423209 7508 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:21.423682 master-0 kubenswrapper[7508]: E0313 10:36:21.423300 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:23.423279795 +0000 UTC m=+22.166104912 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : configmap "client-ca" not found
Mar 13 10:36:21.424116 master-0 kubenswrapper[7508]: I0313 10:36:21.423981 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:21.424316 master-0 kubenswrapper[7508]: I0313 10:36:21.424289 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:21.424378 master-0 kubenswrapper[7508]: I0313 10:36:21.424337 7508 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/60f91926-4b7d-4cb5-bd55-4600ff560156-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:21.424378 master-0 kubenswrapper[7508]: I0313 10:36:21.424354 7508 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/60f91926-4b7d-4cb5-bd55-4600ff560156-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:21.424440 master-0 kubenswrapper[7508]: E0313 10:36:21.424432 7508 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:21.424486 master-0 kubenswrapper[7508]: E0313 10:36:21.424468 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:23.424455516 +0000 UTC m=+22.167280633 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : secret "serving-cert" not found
Mar 13 10:36:21.505071 master-0 kubenswrapper[7508]: I0313 10:36:21.504955 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="60f91926-4b7d-4cb5-bd55-4600ff560156" path="/var/lib/kubelet/pods/60f91926-4b7d-4cb5-bd55-4600ff560156/volumes"
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: I0313 10:36:21.625983 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: I0313 10:36:21.626036 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.626164 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.626260 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:22.626235899 +0000 UTC m=+21.369061076 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "audit-0" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: I0313 10:36:21.627142 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.627779 7508 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.627882 7508 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: I0313 10:36:21.627950 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.628013 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:22.627955774 +0000 UTC m=+21.370780891 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "etcd-client" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.628078 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:22.628071287 +0000 UTC m=+21.370896404 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "serving-cert" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.628024 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 13 10:36:21.629290 master-0 kubenswrapper[7508]: E0313 10:36:21.628223 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:22.628174389 +0000 UTC m=+21.370999576 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "etcd-serving-ca" not found
Mar 13 10:36:22.258537 master-0 kubenswrapper[7508]: I0313 10:36:22.257625 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"]
Mar 13 10:36:22.283119 master-0 kubenswrapper[7508]: I0313 10:36:22.282242 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"]
Mar 13 10:36:22.283119 master-0 kubenswrapper[7508]: I0313 10:36:22.282419 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"
Mar 13 10:36:22.288681 master-0 kubenswrapper[7508]: I0313 10:36:22.288055 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 10:36:22.288681 master-0 kubenswrapper[7508]: I0313 10:36:22.288280 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c"}
Mar 13 10:36:22.288681 master-0 kubenswrapper[7508]: I0313 10:36:22.288331 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 10:36:22.342128 master-0 kubenswrapper[7508]: I0313 10:36:22.338870 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mjm7\" (UniqueName: \"kubernetes.io/projected/fd91626c-38a8-462f-8bc0-96d57532de87-kube-api-access-7mjm7\") pod \"migrator-57ccdf9b5-k9n8l\" (UID: \"fd91626c-38a8-462f-8bc0-96d57532de87\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"
Mar 13 10:36:22.441305 master-0 kubenswrapper[7508]: I0313 10:36:22.441249 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjm7\" (UniqueName: \"kubernetes.io/projected/fd91626c-38a8-462f-8bc0-96d57532de87-kube-api-access-7mjm7\") pod \"migrator-57ccdf9b5-k9n8l\" (UID: \"fd91626c-38a8-462f-8bc0-96d57532de87\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"
Mar 13 10:36:22.471811 master-0 kubenswrapper[7508]: I0313 10:36:22.471729 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjm7\" (UniqueName: \"kubernetes.io/projected/fd91626c-38a8-462f-8bc0-96d57532de87-kube-api-access-7mjm7\") pod \"migrator-57ccdf9b5-k9n8l\" (UID: \"fd91626c-38a8-462f-8bc0-96d57532de87\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"
Mar 13 10:36:22.609175 master-0 kubenswrapper[7508]: I0313 10:36:22.609014 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"
Mar 13 10:36:22.627932 master-0 kubenswrapper[7508]: I0313 10:36:22.627824 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-b7b97495d-hp7mr"]
Mar 13 10:36:22.628498 master-0 kubenswrapper[7508]: I0313 10:36:22.628473 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.631571 master-0 kubenswrapper[7508]: I0313 10:36:22.630561 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:36:22.631571 master-0 kubenswrapper[7508]: I0313 10:36:22.631049 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 10:36:22.631571 master-0 kubenswrapper[7508]: I0313 10:36:22.631464 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 10:36:22.632277 master-0 kubenswrapper[7508]: I0313 10:36:22.632228 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 10:36:22.633954 master-0 kubenswrapper[7508]: I0313 10:36:22.632778 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 10:36:22.638870 master-0 kubenswrapper[7508]: I0313 10:36:22.637118 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-b7b97495d-hp7mr"]
Mar 13 10:36:22.638870 master-0 kubenswrapper[7508]: I0313 10:36:22.638565 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: I0313 10:36:22.642566 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-config\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: I0313 10:36:22.642596 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: I0313 10:36:22.642624 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: I0313 10:36:22.642670 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: E0313 10:36:22.642784 7508 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: E0313 10:36:22.642832 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:24.642810542 +0000 UTC m=+23.385635659 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "serving-cert" not found
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: E0313 10:36:22.642838 7508 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: I0313 10:36:22.642827 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: E0313 10:36:22.642878 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:24.642867483 +0000 UTC m=+23.385692600 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "etcd-client" not found
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: I0313 10:36:22.642938 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-proxy-ca-bundles\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.643042 master-0 kubenswrapper[7508]: E0313 10:36:22.643014 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 13 10:36:22.646163 master-0 kubenswrapper[7508]: I0313 10:36:22.643108 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:22.646163 master-0 kubenswrapper[7508]: E0313 10:36:22.643168 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 13 10:36:22.646163 master-0 kubenswrapper[7508]: E0313 10:36:22.644737 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:24.644702851 +0000 UTC m=+23.387528018 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "etcd-serving-ca" not found
Mar 13 10:36:22.646163 master-0 kubenswrapper[7508]: I0313 10:36:22.644792 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwwp2\" (UniqueName: \"kubernetes.io/projected/db40c498-c94c-4009-9620-5b4ff9e28668-kube-api-access-vwwp2\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.646163 master-0 kubenswrapper[7508]: I0313 10:36:22.644812 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.646163 master-0 kubenswrapper[7508]: E0313 10:36:22.644864 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:24.644852185 +0000 UTC m=+23.387677302 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "audit-0" not found
Mar 13 10:36:22.745977 master-0 kubenswrapper[7508]: I0313 10:36:22.745915 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwwp2\" (UniqueName: \"kubernetes.io/projected/db40c498-c94c-4009-9620-5b4ff9e28668-kube-api-access-vwwp2\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: I0313 10:36:22.746004 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: I0313 10:36:22.746223 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-config\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: E0313 10:36:22.746352 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: I0313 10:36:22.746366 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: E0313 10:36:22.746429 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:23.246406674 +0000 UTC m=+21.989231801 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : secret "serving-cert" not found
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: E0313 10:36:22.746444 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: E0313 10:36:22.746476 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:23.246463945 +0000 UTC m=+21.989289052 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : configmap "client-ca" not found
Mar 13 10:36:22.747063 master-0 kubenswrapper[7508]: I0313 10:36:22.746542 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-proxy-ca-bundles\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.748179 master-0 kubenswrapper[7508]: I0313 10:36:22.747712 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-proxy-ca-bundles\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.748179 master-0 kubenswrapper[7508]: I0313 10:36:22.747869 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-config\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:22.815191 master-0 kubenswrapper[7508]: I0313 10:36:22.815143 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwwp2\" (UniqueName: \"kubernetes.io/projected/db40c498-c94c-4009-9620-5b4ff9e28668-kube-api-access-vwwp2\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:23.251634 master-0 kubenswrapper[7508]: I0313 10:36:23.251582 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:23.251634 master-0 kubenswrapper[7508]: I0313 10:36:23.251633 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:23.251879 master-0 kubenswrapper[7508]: E0313 10:36:23.251767 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:23.251879 master-0 kubenswrapper[7508]: E0313 10:36:23.251815 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:24.251801235 +0000 UTC m=+22.994626352 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : configmap "client-ca" not found
Mar 13 10:36:23.251879 master-0 kubenswrapper[7508]: E0313 10:36:23.251875 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:23.251969 master-0 kubenswrapper[7508]: E0313 10:36:23.251898 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:24.251891737 +0000 UTC m=+22.994716854 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : secret "serving-cert" not found
Mar 13 10:36:23.453236 master-0 kubenswrapper[7508]: I0313 10:36:23.453159 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:23.453995 master-0 kubenswrapper[7508]: I0313 10:36:23.453274 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:23.453995 master-0 kubenswrapper[7508]: E0313 10:36:23.453438 7508 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:23.453995 master-0 kubenswrapper[7508]: E0313 10:36:23.453488 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:27.453471645 +0000 UTC m=+26.196296762 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : secret "serving-cert" not found
Mar 13 10:36:23.454126 master-0 kubenswrapper[7508]: E0313 10:36:23.453901 7508 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:23.454126 master-0 kubenswrapper[7508]: E0313 10:36:23.454105 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:27.454080241 +0000 UTC m=+26.196905358 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : configmap "client-ca" not found
Mar 13 10:36:24.261425 master-0 kubenswrapper[7508]: I0313 10:36:24.261289 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:24.261425 master-0 kubenswrapper[7508]: I0313 10:36:24.261358 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:24.261666 master-0 kubenswrapper[7508]: E0313 10:36:24.261516 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:24.261666 master-0 kubenswrapper[7508]: E0313 10:36:24.261601 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:26.261582542 +0000 UTC m=+25.004407659 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : secret "serving-cert" not found
Mar 13 10:36:24.261975 master-0 kubenswrapper[7508]: E0313 10:36:24.261861 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:24.261975 master-0 kubenswrapper[7508]: E0313 10:36:24.261957 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:26.261934291 +0000 UTC m=+25.004759408 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : configmap "client-ca" not found
Mar 13 10:36:24.295237 master-0 kubenswrapper[7508]: I0313 10:36:24.294637 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-55t7x" event={"ID":"58685de6-b4ae-4229-870b-5143a6010450","Type":"ContainerStarted","Data":"31df9233b4a5d4d57a39c81be8f4431504aae76b625128b5139003e68085c9bf"}
Mar 13 10:36:24.676663 master-0 kubenswrapper[7508]: I0313 10:36:24.676238 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:24.676663 master-0 kubenswrapper[7508]: I0313 10:36:24.676655 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.676421 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: I0313 10:36:24.676766 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.676800 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:28.676775691 +0000 UTC m=+27.419600868 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "audit-0" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.676904 7508 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.676941 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/etcd-serving-ca: configmap "etcd-serving-ca" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.676970 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:28.676948435 +0000 UTC m=+27.419773622 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "serving-cert" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.676996 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:28.676981706 +0000 UTC m=+27.419806953 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "etcd-serving-ca" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "etcd-serving-ca" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.677047 7508 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: E0313 10:36:24.677075 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:28.677067518 +0000 UTC m=+27.419892725 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "etcd-client" not found
Mar 13 10:36:24.677367 master-0 kubenswrapper[7508]: I0313 10:36:24.676903 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:26.299067 master-0 kubenswrapper[7508]: I0313 10:36:26.298980 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:26.299736 master-0 kubenswrapper[7508]: E0313 10:36:26.299192 7508 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:26.299736 master-0 kubenswrapper[7508]: I0313 10:36:26.299223 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:26.299736 master-0 kubenswrapper[7508]: E0313 10:36:26.299277 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:30.299252937 +0000 UTC m=+29.042078074 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : secret "serving-cert" not found
Mar 13 10:36:26.299736 master-0 kubenswrapper[7508]: E0313 10:36:26.299317 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:26.299736 master-0 kubenswrapper[7508]: E0313 10:36:26.299367 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:30.299352019 +0000 UTC m=+29.042177136 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : configmap "client-ca" not found
Mar 13 10:36:27.515937 master-0 kubenswrapper[7508]: I0313 10:36:27.515819 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:27.516609 master-0 kubenswrapper[7508]: E0313 10:36:27.516031 7508 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 13 10:36:27.516609 master-0 kubenswrapper[7508]: E0313 10:36:27.516139 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:35.516115224 +0000 UTC m=+34.258940411 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : secret "serving-cert" not found
Mar 13 10:36:27.516609 master-0 kubenswrapper[7508]: I0313 10:36:27.516236 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:27.516609 master-0 kubenswrapper[7508]: E0313 10:36:27.516366 7508 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:27.516609 master-0 kubenswrapper[7508]: E0313 10:36:27.516447 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:35.516427972 +0000 UTC m=+34.259253149 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : configmap "client-ca" not found
Mar 13 10:36:27.745669 master-0 kubenswrapper[7508]: I0313 10:36:27.745571 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:27.745986 master-0 kubenswrapper[7508]: I0313 10:36:27.745916 7508 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 10:36:27.773036 master-0 kubenswrapper[7508]: I0313 10:36:27.772600 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:36:28.732640 master-0 kubenswrapper[7508]: I0313 10:36:28.732545 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:28.732640 master-0 kubenswrapper[7508]: I0313 10:36:28.732653 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:28.733617 master-0 kubenswrapper[7508]: I0313 10:36:28.732712 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:28.733617 master-0 kubenswrapper[7508]: E0313 10:36:28.732914 7508 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 13 10:36:28.733617 master-0 kubenswrapper[7508]: E0313 10:36:28.733053 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:36.733021024 +0000 UTC m=+35.475846171 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : secret "etcd-client" not found
Mar 13 10:36:28.733617 master-0 kubenswrapper[7508]: I0313 10:36:28.733198 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:28.733617 master-0 kubenswrapper[7508]: E0313 10:36:28.733486 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 13 10:36:28.733617 master-0 kubenswrapper[7508]: E0313 10:36:28.733602 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:36.733573888 +0000 UTC m=+35.476399025 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : configmap "audit-0" not found
Mar 13 10:36:28.734470 master-0 kubenswrapper[7508]: I0313 10:36:28.734409 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:28.739129 master-0 kubenswrapper[7508]: I0313 10:36:28.739053 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:30.354133 master-0 kubenswrapper[7508]: I0313 10:36:30.354007 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:30.355486 master-0 kubenswrapper[7508]: I0313 10:36:30.355423 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:30.355768 master-0 kubenswrapper[7508]: E0313 10:36:30.355715 7508 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:30.355896 master-0 kubenswrapper[7508]: E0313 10:36:30.355819 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca podName:db40c498-c94c-4009-9620-5b4ff9e28668 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:38.355787887 +0000 UTC m=+37.098613044 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca") pod "controller-manager-b7b97495d-hp7mr" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668") : configmap "client-ca" not found
Mar 13 10:36:30.361346 master-0 kubenswrapper[7508]: I0313 10:36:30.361286 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:30.662124 master-0 kubenswrapper[7508]: I0313 10:36:30.662056 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 13 10:36:30.662548 master-0 kubenswrapper[7508]: I0313 10:36:30.662528 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.675124 master-0 kubenswrapper[7508]: I0313 10:36:30.671746 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 13 10:36:30.761793 master-0 kubenswrapper[7508]: I0313 10:36:30.761748 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.761793 master-0 kubenswrapper[7508]: I0313 10:36:30.761789 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-var-lock\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.761793 master-0 kubenswrapper[7508]: I0313 10:36:30.761812 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.862790 master-0 kubenswrapper[7508]: I0313 10:36:30.862709 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.862790 master-0 kubenswrapper[7508]: I0313 10:36:30.862773 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-var-lock\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.862790 master-0 kubenswrapper[7508]: I0313 10:36:30.862798 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.863174 master-0 kubenswrapper[7508]: I0313 10:36:30.862979 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:30.863298 master-0 kubenswrapper[7508]: I0313 10:36:30.863275 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-var-lock\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:33.304660 master-0 kubenswrapper[7508]: I0313 10:36:33.304196 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 13 10:36:34.417002 master-0 kubenswrapper[7508]: I0313 10:36:34.416900 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:34.417002 master-0 kubenswrapper[7508]: I0313 10:36:34.417001 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:34.417828 master-0 kubenswrapper[7508]: I0313 10:36:34.417067 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:34.424650 master-0 kubenswrapper[7508]: I0313 10:36:34.424311 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:34.424777 master-0 kubenswrapper[7508]: I0313 10:36:34.424416 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:34.424777 master-0 kubenswrapper[7508]: I0313 10:36:34.424562 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:34.518532 master-0 kubenswrapper[7508]: I0313 10:36:34.518409 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:34.518532 master-0 kubenswrapper[7508]: I0313 10:36:34.518547 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:34.518970 master-0 kubenswrapper[7508]: I0313 10:36:34.518671 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:34.519308 master-0 kubenswrapper[7508]: I0313 10:36:34.519224 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:34.524601 master-0 kubenswrapper[7508]: I0313 10:36:34.524553 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:34.525514 master-0 kubenswrapper[7508]: I0313 10:36:34.525462 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:34.525592 master-0 kubenswrapper[7508]: I0313 10:36:34.525520 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:34.525780 master-0 kubenswrapper[7508]: I0313 10:36:34.525726 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:34.552256 master-0 kubenswrapper[7508]: I0313 10:36:34.552166 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:34.552256 master-0 kubenswrapper[7508]: I0313 10:36:34.552204 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:34.559899 master-0 kubenswrapper[7508]: I0313 10:36:34.559841 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:36:34.560139 master-0 kubenswrapper[7508]: I0313 10:36:34.559931 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:34.561085 master-0 kubenswrapper[7508]: I0313 10:36:34.560982 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:34.568670 master-0 kubenswrapper[7508]: I0313 10:36:34.568574 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:36:34.568939 master-0 kubenswrapper[7508]: I0313 10:36:34.568623 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:36:34.650125 master-0 kubenswrapper[7508]: I0313 10:36:34.647811 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:34.883457 master-0 kubenswrapper[7508]: I0313 10:36:34.883368 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 13 10:36:35.196296 master-0 kubenswrapper[7508]: I0313 10:36:35.184815 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-kxkw5"]
Mar 13 10:36:35.196296 master-0 kubenswrapper[7508]: E0313 10:36:35.185173 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit etcd-client], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" podUID="f20123c9-6e65-4b74-919f-8b399022e0f5"
Mar 13 10:36:35.502249 master-0 kubenswrapper[7508]: I0313 10:36:35.498040 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5"
Mar 13 10:36:35.535120 master-0 kubenswrapper[7508]: I0313 10:36:35.534693 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:35.535120 master-0 kubenswrapper[7508]: I0313 10:36:35.534784 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:35.539121 master-0 kubenswrapper[7508]: E0313 10:36:35.536248 7508 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 13 10:36:35.539121 master-0 kubenswrapper[7508]: E0313 10:36:35.536369 7508
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca podName:f640710b-2b20-438d-b847-18b8fcb77b4c nodeName:}" failed. No retries permitted until 2026-03-13 10:36:51.536342733 +0000 UTC m=+50.279167850 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca") pod "route-controller-manager-787464f5f6-wjm6v" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c") : configmap "client-ca" not found Mar 13 10:36:35.552275 master-0 kubenswrapper[7508]: I0313 10:36:35.547460 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"route-controller-manager-787464f5f6-wjm6v\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:35.561121 master-0 kubenswrapper[7508]: I0313 10:36:35.556167 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:35.610125 master-0 kubenswrapper[7508]: I0313 10:36:35.610002 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"] Mar 13 10:36:35.615119 master-0 kubenswrapper[7508]: I0313 10:36:35.610900 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.615119 master-0 kubenswrapper[7508]: I0313 10:36:35.613681 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 13 10:36:35.619116 master-0 kubenswrapper[7508]: I0313 10:36:35.616080 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 13 10:36:35.619116 master-0 kubenswrapper[7508]: I0313 10:36:35.616306 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 13 10:36:35.639110 master-0 kubenswrapper[7508]: I0313 10:36:35.638429 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"] Mar 13 10:36:35.712346 master-0 kubenswrapper[7508]: I0313 10:36:35.711824 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"] Mar 13 10:36:35.712654 master-0 kubenswrapper[7508]: I0313 10:36:35.712564 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.716829 master-0 kubenswrapper[7508]: I0313 10:36:35.715807 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 13 10:36:35.716829 master-0 kubenswrapper[7508]: I0313 10:36:35.716196 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 10:36:35.716829 master-0 kubenswrapper[7508]: I0313 10:36:35.716399 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 13 10:36:35.727236 master-0 kubenswrapper[7508]: I0313 10:36:35.726077 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"] Mar 13 10:36:35.727236 master-0 kubenswrapper[7508]: I0313 10:36:35.727198 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 13 10:36:35.738384 master-0 kubenswrapper[7508]: I0313 10:36:35.738350 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-node-pullsecrets\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.738577 master-0 kubenswrapper[7508]: I0313 10:36:35.738558 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-audit-dir\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.738717 master-0 kubenswrapper[7508]: I0313 10:36:35.738698 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.738823 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-image-import-ca\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.738874 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mvcmz\" (UniqueName: \"kubernetes.io/projected/f20123c9-6e65-4b74-919f-8b399022e0f5-kube-api-access-mvcmz\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.738901 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.738923 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-trusted-ca-bundle\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.738969 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-encryption-config\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: 
\"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739003 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-config\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739154 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739197 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jk4qr\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-kube-api-access-jk4qr\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739226 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nqrh5\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-kube-api-access-nqrh5\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739270 7508 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db9faadf-74e9-4a7f-b3a6-902dd14ac978-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739313 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/db9faadf-74e9-4a7f-b3a6-902dd14ac978-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739336 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739362 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739386 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: 
\"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739443 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ec33c506-8abe-4659-84d3-a294c31b446c-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739479 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739507 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.739650 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: 
"f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:36:35.740167 master-0 kubenswrapper[7508]: I0313 10:36:35.740065 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:36:35.741231 master-0 kubenswrapper[7508]: I0313 10:36:35.740779 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:35.741231 master-0 kubenswrapper[7508]: I0313 10:36:35.740800 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:35.743532 master-0 kubenswrapper[7508]: I0313 10:36:35.741485 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:35.743532 master-0 kubenswrapper[7508]: I0313 10:36:35.742943 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:36:35.751506 master-0 kubenswrapper[7508]: I0313 10:36:35.746065 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f20123c9-6e65-4b74-919f-8b399022e0f5-kube-api-access-mvcmz" (OuterVolumeSpecName: "kube-api-access-mvcmz") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "kube-api-access-mvcmz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:36:35.751506 master-0 kubenswrapper[7508]: I0313 10:36:35.748039 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-config" (OuterVolumeSpecName: "config") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:35.752428 master-0 kubenswrapper[7508]: I0313 10:36:35.752328 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:36:35.840439 master-0 kubenswrapper[7508]: I0313 10:36:35.840380 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.840439 master-0 kubenswrapper[7508]: I0313 10:36:35.840440 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4qr\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-kube-api-access-jk4qr\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.840837 master-0 kubenswrapper[7508]: I0313 10:36:35.840464 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrh5\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-kube-api-access-nqrh5\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.840970 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db9faadf-74e9-4a7f-b3a6-902dd14ac978-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841036 7508 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/db9faadf-74e9-4a7f-b3a6-902dd14ac978-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841063 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841113 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841146 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841173 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ec33c506-8abe-4659-84d3-a294c31b446c-cache\") pod 
\"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841223 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841245 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841289 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841303 7508 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-node-pullsecrets\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841317 7508 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f20123c9-6e65-4b74-919f-8b399022e0f5-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841330 7508 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841342 7508 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-image-import-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841355 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mvcmz\" (UniqueName: \"kubernetes.io/projected/f20123c9-6e65-4b74-919f-8b399022e0f5-kube-api-access-mvcmz\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841368 7508 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841379 7508 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841392 7508 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841531 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: 
\"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841558 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841600 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.841673 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.842045 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ec33c506-8abe-4659-84d3-a294c31b446c-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.842684 master-0 kubenswrapper[7508]: I0313 10:36:35.842688 
7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db9faadf-74e9-4a7f-b3a6-902dd14ac978-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.848739 master-0 kubenswrapper[7508]: I0313 10:36:35.845456 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.848739 master-0 kubenswrapper[7508]: I0313 10:36:35.845868 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/db9faadf-74e9-4a7f-b3a6-902dd14ac978-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.848739 master-0 kubenswrapper[7508]: I0313 10:36:35.848676 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.864461 master-0 kubenswrapper[7508]: I0313 10:36:35.864292 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrh5\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-kube-api-access-nqrh5\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: 
\"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:35.893325 master-0 kubenswrapper[7508]: I0313 10:36:35.893270 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4qr\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-kube-api-access-jk4qr\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:35.958524 master-0 kubenswrapper[7508]: I0313 10:36:35.958446 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:36.044362 master-0 kubenswrapper[7508]: I0313 10:36:36.044167 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:36.397033 master-0 kubenswrapper[7508]: I0313 10:36:36.394887 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l"] Mar 13 10:36:36.450652 master-0 kubenswrapper[7508]: W0313 10:36:36.450490 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd91626c_38a8_462f_8bc0_96d57532de87.slice/crio-1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d WatchSource:0}: Error finding container 1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d: Status 404 returned error can't find the container with id 1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d Mar 13 10:36:36.522434 master-0 kubenswrapper[7508]: I0313 10:36:36.522402 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:36.523329 master-0 kubenswrapper[7508]: I0313 10:36:36.522368 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" event={"ID":"fd91626c-38a8-462f-8bc0-96d57532de87","Type":"ContainerStarted","Data":"1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d"} Mar 13 10:36:36.726209 master-0 kubenswrapper[7508]: I0313 10:36:36.718874 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-kxkw5"] Mar 13 10:36:36.726826 master-0 kubenswrapper[7508]: I0313 10:36:36.726670 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-567956995b-dmf5x"] Mar 13 10:36:36.728320 master-0 kubenswrapper[7508]: I0313 10:36:36.727888 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.734267 master-0 kubenswrapper[7508]: I0313 10:36:36.731614 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-74b98ff8f9-kxkw5"] Mar 13 10:36:36.742110 master-0 kubenswrapper[7508]: I0313 10:36:36.742058 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 10:36:36.742110 master-0 kubenswrapper[7508]: I0313 10:36:36.742083 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 10:36:36.742466 master-0 kubenswrapper[7508]: I0313 10:36:36.742435 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 10:36:36.742816 master-0 kubenswrapper[7508]: I0313 10:36:36.742797 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 10:36:36.743080 master-0 kubenswrapper[7508]: I0313 
10:36:36.743042 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-567956995b-dmf5x"] Mar 13 10:36:36.743189 master-0 kubenswrapper[7508]: I0313 10:36:36.743157 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 10:36:36.743291 master-0 kubenswrapper[7508]: I0313 10:36:36.743268 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 10:36:36.743722 master-0 kubenswrapper[7508]: I0313 10:36:36.743704 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 10:36:36.744038 master-0 kubenswrapper[7508]: I0313 10:36:36.744021 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 10:36:36.744560 master-0 kubenswrapper[7508]: I0313 10:36:36.744534 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 10:36:36.747790 master-0 kubenswrapper[7508]: I0313 10:36:36.746036 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 10:36:36.757040 master-0 kubenswrapper[7508]: I0313 10:36:36.753272 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:36.757040 master-0 kubenswrapper[7508]: I0313 10:36:36.753333 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " 
pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:36.757040 master-0 kubenswrapper[7508]: E0313 10:36:36.753543 7508 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: object "openshift-apiserver"/"audit-0" not registered Mar 13 10:36:36.757040 master-0 kubenswrapper[7508]: E0313 10:36:36.753588 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit podName:f20123c9-6e65-4b74-919f-8b399022e0f5 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:52.75357389 +0000 UTC m=+51.496399007 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit") pod "apiserver-74b98ff8f9-kxkw5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5") : object "openshift-apiserver"/"audit-0" not registered Mar 13 10:36:36.762453 master-0 kubenswrapper[7508]: I0313 10:36:36.762208 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"apiserver-74b98ff8f9-kxkw5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " pod="openshift-apiserver/apiserver-74b98ff8f9-kxkw5" Mar 13 10:36:36.803167 master-0 kubenswrapper[7508]: I0313 10:36:36.798348 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c5vhc"] Mar 13 10:36:36.854126 master-0 kubenswrapper[7508]: I0313 10:36:36.853892 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") pod \"f20123c9-6e65-4b74-919f-8b399022e0f5\" (UID: \"f20123c9-6e65-4b74-919f-8b399022e0f5\") " Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854230 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit-dir\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854271 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-node-pullsecrets\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854294 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-client\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854318 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm5z8\" (UniqueName: \"kubernetes.io/projected/742db892-aef7-428e-b0c0-b54c6c9bf48e-kube-api-access-gm5z8\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854845 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: 
I0313 10:36:36.854879 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-serving-cert\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854898 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-image-import-ca\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854923 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-config\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.854976 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-encryption-config\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.855004 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-serving-ca\") pod \"apiserver-567956995b-dmf5x\" (UID: 
\"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.855024 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-trusted-ca-bundle\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.855055 7508 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/f20123c9-6e65-4b74-919f-8b399022e0f5-audit\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:36.858200 master-0 kubenswrapper[7508]: I0313 10:36:36.857237 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "f20123c9-6e65-4b74-919f-8b399022e0f5" (UID: "f20123c9-6e65-4b74-919f-8b399022e0f5"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:36:36.948630 master-0 kubenswrapper[7508]: I0313 10:36:36.948563 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 10:36:36.958809 master-0 kubenswrapper[7508]: I0313 10:36:36.958743 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-serving-ca\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.958809 master-0 kubenswrapper[7508]: I0313 10:36:36.958790 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-trusted-ca-bundle\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959071 master-0 kubenswrapper[7508]: I0313 10:36:36.958820 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit-dir\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959071 master-0 kubenswrapper[7508]: I0313 10:36:36.959034 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-node-pullsecrets\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959071 master-0 kubenswrapper[7508]: I0313 10:36:36.959053 7508 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-client\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959173 master-0 kubenswrapper[7508]: I0313 10:36:36.959075 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm5z8\" (UniqueName: \"kubernetes.io/projected/742db892-aef7-428e-b0c0-b54c6c9bf48e-kube-api-access-gm5z8\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959173 master-0 kubenswrapper[7508]: I0313 10:36:36.959122 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959173 master-0 kubenswrapper[7508]: I0313 10:36:36.959138 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-serving-cert\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959173 master-0 kubenswrapper[7508]: I0313 10:36:36.959153 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-image-import-ca\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959173 master-0 kubenswrapper[7508]: I0313 10:36:36.959172 7508 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-config\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959318 master-0 kubenswrapper[7508]: I0313 10:36:36.959194 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-encryption-config\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.959318 master-0 kubenswrapper[7508]: I0313 10:36:36.959222 7508 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f20123c9-6e65-4b74-919f-8b399022e0f5-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.960846 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.961232 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-image-import-ca\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.961667 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-config\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.962603 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-trusted-ca-bundle\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.963060 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-serving-ca\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.964495 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit-dir\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.964550 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-node-pullsecrets\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.972138 master-0 kubenswrapper[7508]: I0313 10:36:36.966982 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-client\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.976060 master-0 kubenswrapper[7508]: I0313 10:36:36.975982 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-encryption-config\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.976522 master-0 kubenswrapper[7508]: I0313 10:36:36.976415 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-serving-cert\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:36.987786 master-0 kubenswrapper[7508]: I0313 10:36:36.985820 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm5z8\" (UniqueName: \"kubernetes.io/projected/742db892-aef7-428e-b0c0-b54c6c9bf48e-kube-api-access-gm5z8\") pod \"apiserver-567956995b-dmf5x\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") " pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:37.020573 master-0 kubenswrapper[7508]: I0313 10:36:37.020232 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"] Mar 13 10:36:37.083299 master-0 kubenswrapper[7508]: I0313 10:36:37.082819 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"] Mar 13 10:36:37.085324 master-0 kubenswrapper[7508]: I0313 10:36:37.085291 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"] Mar 13 10:36:37.094621 master-0 kubenswrapper[7508]: I0313 10:36:37.088344 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-567956995b-dmf5x" Mar 13 10:36:37.105003 master-0 kubenswrapper[7508]: I0313 10:36:37.104642 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:36:37.113351 master-0 kubenswrapper[7508]: I0313 10:36:37.111036 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"] Mar 13 10:36:37.113351 master-0 kubenswrapper[7508]: I0313 10:36:37.111084 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"] Mar 13 10:36:37.130338 master-0 kubenswrapper[7508]: I0313 10:36:37.127778 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-6gzxr"] Mar 13 10:36:37.142187 master-0 kubenswrapper[7508]: I0313 10:36:37.142142 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"] Mar 13 10:36:37.166241 master-0 kubenswrapper[7508]: I0313 10:36:37.166204 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"] Mar 13 10:36:37.173911 master-0 kubenswrapper[7508]: I0313 10:36:37.173876 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-mzx9f"] Mar 13 10:36:37.174772 master-0 kubenswrapper[7508]: I0313 10:36:37.174757 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.177730 master-0 kubenswrapper[7508]: W0313 10:36:37.176347 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5da919b6_8545_4001_89f3_74cb289327f0.slice/crio-9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190 WatchSource:0}: Error finding container 9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190: Status 404 returned error can't find the container with id 9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190 Mar 13 10:36:37.178791 master-0 kubenswrapper[7508]: I0313 10:36:37.178692 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-kubernetes\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178791 master-0 kubenswrapper[7508]: I0313 10:36:37.178726 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysconfig\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178791 master-0 kubenswrapper[7508]: I0313 10:36:37.178743 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-lib-modules\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178791 master-0 kubenswrapper[7508]: I0313 10:36:37.178764 7508 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-modprobe-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178791 master-0 kubenswrapper[7508]: I0313 10:36:37.178785 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-conf\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 10:36:37.178814 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-sys\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 10:36:37.178831 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-var-lib-kubelet\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 10:36:37.178870 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-tuned\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 
10:36:37.178885 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 10:36:37.178921 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-host\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 10:36:37.178955 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-tmp\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.178984 master-0 kubenswrapper[7508]: I0313 10:36:37.178973 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5gkd\" (UniqueName: \"kubernetes.io/projected/a3a72b45-a705-4335-9c04-c952ec5d9975-kube-api-access-b5gkd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.179354 master-0 kubenswrapper[7508]: I0313 10:36:37.178993 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-run\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.179354 master-0 
kubenswrapper[7508]: I0313 10:36:37.179015 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-systemd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282408 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-host\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282457 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-tmp\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282476 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gkd\" (UniqueName: \"kubernetes.io/projected/a3a72b45-a705-4335-9c04-c952ec5d9975-kube-api-access-b5gkd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282492 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-run\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282508 7508 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-systemd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282527 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-kubernetes\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282547 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysconfig\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282626 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-host\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282693 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysconfig\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282806 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-run\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282872 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-systemd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282954 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-kubernetes\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282978 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-lib-modules\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.282996 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-modprobe-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283030 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: 
\"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-conf\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283046 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-sys\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283071 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-var-lib-kubelet\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283089 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-tuned\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283116 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283207 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283234 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-sys\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283254 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-conf\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283295 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-var-lib-kubelet\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283298 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-modprobe-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.283443 master-0 kubenswrapper[7508]: I0313 10:36:37.283412 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-lib-modules\") pod 
\"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.290617 master-0 kubenswrapper[7508]: I0313 10:36:37.288008 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-tmp\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.308633 master-0 kubenswrapper[7508]: I0313 10:36:37.308379 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-tuned\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.315201 master-0 kubenswrapper[7508]: I0313 10:36:37.314976 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gkd\" (UniqueName: \"kubernetes.io/projected/a3a72b45-a705-4335-9c04-c952ec5d9975-kube-api-access-b5gkd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.340066 master-0 kubenswrapper[7508]: I0313 10:36:37.339986 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b7b97495d-hp7mr"] Mar 13 10:36:37.340374 master-0 kubenswrapper[7508]: E0313 10:36:37.340348 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr" podUID="db40c498-c94c-4009-9620-5b4ff9e28668" Mar 13 10:36:37.381953 master-0 kubenswrapper[7508]: I0313 10:36:37.381901 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"] Mar 13 10:36:37.382247 master-0 kubenswrapper[7508]: E0313 10:36:37.382193 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" podUID="f640710b-2b20-438d-b847-18b8fcb77b4c" Mar 13 10:36:37.583721 master-0 kubenswrapper[7508]: I0313 10:36:37.580713 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:36:37.596720 master-0 kubenswrapper[7508]: I0313 10:36:37.596671 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f20123c9-6e65-4b74-919f-8b399022e0f5" path="/var/lib/kubelet/pods/f20123c9-6e65-4b74-919f-8b399022e0f5/volumes" Mar 13 10:36:37.605617 master-0 kubenswrapper[7508]: I0313 10:36:37.604461 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" event={"ID":"b04498f0-5a3f-4461-aecb-50304662d854","Type":"ContainerStarted","Data":"3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc"} Mar 13 10:36:37.806823 master-0 kubenswrapper[7508]: I0313 10:36:37.806754 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-567956995b-dmf5x"] Mar 13 10:36:37.821064 master-0 kubenswrapper[7508]: I0313 10:36:37.819076 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" event={"ID":"e7d31378-e940-4473-ab37-10f250c76666","Type":"ContainerStarted","Data":"f8c529cacd73744ab46d9439b7345a1d27edc1e2d71b7933b404f4206bf30909"} Mar 13 10:36:37.821064 master-0 kubenswrapper[7508]: I0313 10:36:37.819315 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" 
event={"ID":"e7d31378-e940-4473-ab37-10f250c76666","Type":"ContainerStarted","Data":"e0d3f2c007226936b12b661f6223b55c53b4c84c882223c0c75ca57b895fa28c"} Mar 13 10:36:37.852873 master-0 kubenswrapper[7508]: I0313 10:36:37.852557 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" event={"ID":"024d9bd3-ac77-4257-9808-7518f2a73e11","Type":"ContainerStarted","Data":"390d92c6b1bf8de4d4ea48cb675d878d3b2cbd2b0311fc47e5e4feef80f55449"} Mar 13 10:36:37.854961 master-0 kubenswrapper[7508]: I0313 10:36:37.854001 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" event={"ID":"03b97fde-467c-46f0-95f9-9c3820b4d790","Type":"ContainerStarted","Data":"64cac6ba3a561adbc8f8770dc2f28e49933388f06613c25151f7bbd0ceb39107"} Mar 13 10:36:37.869511 master-0 kubenswrapper[7508]: I0313 10:36:37.869449 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerStarted","Data":"9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190"} Mar 13 10:36:37.891284 master-0 kubenswrapper[7508]: I0313 10:36:37.891228 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" event={"ID":"17b956d3-c046-4f26-8be2-718c165a3acc","Type":"ContainerStarted","Data":"19f35bad4079f0b545148fd4db4666ab80db062f38092a6802b80cab4ec7982a"} Mar 13 10:36:37.905278 master-0 kubenswrapper[7508]: I0313 10:36:37.905215 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"4a344875d4670ed9716f0cef98985188762c8daf81f4743d50027d07c28af916"} Mar 13 10:36:37.905278 master-0 kubenswrapper[7508]: I0313 10:36:37.905271 7508 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"1619a1ce8609d442d9975720a8d6d707786b968509ed048f691e33fc7d117748"} Mar 13 10:36:37.914488 master-0 kubenswrapper[7508]: I0313 10:36:37.914377 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e"} Mar 13 10:36:37.928415 master-0 kubenswrapper[7508]: I0313 10:36:37.928358 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c5vhc" event={"ID":"8df2728b-4f21-4aef-b31f-4197bbcd2728","Type":"ContainerStarted","Data":"77b4f8a8bc891942c93fc6bc58a70209e4d2685ce12294e206b71662186490b9"} Mar 13 10:36:37.929853 master-0 kubenswrapper[7508]: I0313 10:36:37.929820 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerStarted","Data":"2e4a3a4a7895f019e0118f1584bc95eca1f9c60af18c9d3fe595f768be766c6d"} Mar 13 10:36:37.937145 master-0 kubenswrapper[7508]: I0313 10:36:37.936871 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" event={"ID":"d9fd7b06-d61d-47c3-a08f-846245c79cc9","Type":"ContainerStarted","Data":"d551846b834f3c792666af696b3893004dc6412c55eeea4b5cdb805b2eaffa1b"} Mar 13 10:36:37.947238 master-0 kubenswrapper[7508]: I0313 10:36:37.947169 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" 
event={"ID":"a13f3e08-2b67-404f-8695-77aa17f92137","Type":"ContainerStarted","Data":"851fb998ee7d34cb6bb04d5f4061e13a565db5d18010b2516dd1dd436a846840"} Mar 13 10:36:37.947238 master-0 kubenswrapper[7508]: I0313 10:36:37.947228 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" event={"ID":"a13f3e08-2b67-404f-8695-77aa17f92137","Type":"ContainerStarted","Data":"26320b73ca3fce1850dde3e75da5ccc58878b72f0f352ff1a9c176723a2b7d3d"} Mar 13 10:36:37.955382 master-0 kubenswrapper[7508]: I0313 10:36:37.951623 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e1a3cdd6-88be-4a7f-955c-2f0b22082e82","Type":"ContainerStarted","Data":"98d3834f79a7a852f9b92d014f5509a2a10b0b4a9a2902b60d45ac88cc6cadb6"} Mar 13 10:36:37.955382 master-0 kubenswrapper[7508]: I0313 10:36:37.952626 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" event={"ID":"25332da9-099c-4190-9e24-c19c86830a54","Type":"ContainerStarted","Data":"997999accb5a6bff6c2c6f0ce4bfa996a8b256c62954c05e165b3f90b0b8f80d"} Mar 13 10:36:37.958185 master-0 kubenswrapper[7508]: I0313 10:36:37.958138 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"2fe7b69e87a4fa6425da976dffbe87c8c66862e1127867967d8f83ef262d49b7"} Mar 13 10:36:37.959742 master-0 kubenswrapper[7508]: I0313 10:36:37.959720 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr" Mar 13 10:36:37.960229 master-0 kubenswrapper[7508]: I0313 10:36:37.960156 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"646d9925ac7d679e5fe105dacc2e5ba2bf65b630c171bd0e095c89f902ecba0a"} Mar 13 10:36:37.980265 master-0 kubenswrapper[7508]: I0313 10:36:37.980217 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:38.164709 master-0 kubenswrapper[7508]: I0313 10:36:38.164603 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podStartSLOduration=3.133700915 podStartE2EDuration="18.164546381s" podCreationTimestamp="2026-03-13 10:36:20 +0000 UTC" firstStartedPulling="2026-03-13 10:36:21.3134401 +0000 UTC m=+20.056265227" lastFinishedPulling="2026-03-13 10:36:36.344285576 +0000 UTC m=+35.087110693" observedRunningTime="2026-03-13 10:36:37.961635638 +0000 UTC m=+36.704460785" watchObservedRunningTime="2026-03-13 10:36:38.164546381 +0000 UTC m=+36.907371498" Mar 13 10:36:38.165775 master-0 kubenswrapper[7508]: I0313 10:36:38.165556 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr" Mar 13 10:36:38.173564 master-0 kubenswrapper[7508]: I0313 10:36:38.167913 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-qt95m"] Mar 13 10:36:38.173564 master-0 kubenswrapper[7508]: I0313 10:36:38.171162 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.175799 master-0 kubenswrapper[7508]: I0313 10:36:38.175395 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 10:36:38.175799 master-0 kubenswrapper[7508]: I0313 10:36:38.175707 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 10:36:38.176429 master-0 kubenswrapper[7508]: I0313 10:36:38.176031 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qt95m"] Mar 13 10:36:38.176429 master-0 kubenswrapper[7508]: I0313 10:36:38.176407 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 10:36:38.176756 master-0 kubenswrapper[7508]: I0313 10:36:38.176074 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 10:36:38.186626 master-0 kubenswrapper[7508]: I0313 10:36:38.184952 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v" Mar 13 10:36:38.332371 master-0 kubenswrapper[7508]: I0313 10:36:38.331799 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") pod \"f640710b-2b20-438d-b847-18b8fcb77b4c\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " Mar 13 10:36:38.332371 master-0 kubenswrapper[7508]: I0313 10:36:38.331874 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mq9dc\" (UniqueName: \"kubernetes.io/projected/f640710b-2b20-438d-b847-18b8fcb77b4c-kube-api-access-mq9dc\") pod \"f640710b-2b20-438d-b847-18b8fcb77b4c\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " Mar 13 10:36:38.332371 master-0 kubenswrapper[7508]: I0313 10:36:38.331915 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-proxy-ca-bundles\") pod \"db40c498-c94c-4009-9620-5b4ff9e28668\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " Mar 13 10:36:38.332371 master-0 kubenswrapper[7508]: I0313 10:36:38.331946 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") pod \"db40c498-c94c-4009-9620-5b4ff9e28668\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " Mar 13 10:36:38.332371 master-0 kubenswrapper[7508]: I0313 10:36:38.331973 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwwp2\" (UniqueName: \"kubernetes.io/projected/db40c498-c94c-4009-9620-5b4ff9e28668-kube-api-access-vwwp2\") pod \"db40c498-c94c-4009-9620-5b4ff9e28668\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " Mar 13 10:36:38.332371 master-0 
kubenswrapper[7508]: I0313 10:36:38.332006 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-config\") pod \"db40c498-c94c-4009-9620-5b4ff9e28668\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " Mar 13 10:36:38.332371 master-0 kubenswrapper[7508]: I0313 10:36:38.332062 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-config\") pod \"f640710b-2b20-438d-b847-18b8fcb77b4c\" (UID: \"f640710b-2b20-438d-b847-18b8fcb77b4c\") " Mar 13 10:36:38.332814 master-0 kubenswrapper[7508]: I0313 10:36:38.332417 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.332814 master-0 kubenswrapper[7508]: I0313 10:36:38.332478 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gcf6\" (UniqueName: \"kubernetes.io/projected/3e15f776-d153-4289-91c7-893584104185-kube-api-access-2gcf6\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.332814 master-0 kubenswrapper[7508]: I0313 10:36:38.332607 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.334475 master-0 kubenswrapper[7508]: I0313 10:36:38.334426 7508 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "db40c498-c94c-4009-9620-5b4ff9e28668" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:38.334706 master-0 kubenswrapper[7508]: I0313 10:36:38.334673 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-config" (OuterVolumeSpecName: "config") pod "db40c498-c94c-4009-9620-5b4ff9e28668" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:38.335208 master-0 kubenswrapper[7508]: I0313 10:36:38.335029 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-config" (OuterVolumeSpecName: "config") pod "f640710b-2b20-438d-b847-18b8fcb77b4c" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:36:38.338784 master-0 kubenswrapper[7508]: I0313 10:36:38.338734 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f640710b-2b20-438d-b847-18b8fcb77b4c" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:36:38.343828 master-0 kubenswrapper[7508]: I0313 10:36:38.343762 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "db40c498-c94c-4009-9620-5b4ff9e28668" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:36:38.357627 master-0 kubenswrapper[7508]: I0313 10:36:38.347674 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-567956995b-dmf5x"] Mar 13 10:36:38.361963 master-0 kubenswrapper[7508]: I0313 10:36:38.361693 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f640710b-2b20-438d-b847-18b8fcb77b4c-kube-api-access-mq9dc" (OuterVolumeSpecName: "kube-api-access-mq9dc") pod "f640710b-2b20-438d-b847-18b8fcb77b4c" (UID: "f640710b-2b20-438d-b847-18b8fcb77b4c"). InnerVolumeSpecName "kube-api-access-mq9dc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:36:38.375345 master-0 kubenswrapper[7508]: I0313 10:36:38.375268 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db40c498-c94c-4009-9620-5b4ff9e28668-kube-api-access-vwwp2" (OuterVolumeSpecName: "kube-api-access-vwwp2") pod "db40c498-c94c-4009-9620-5b4ff9e28668" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668"). InnerVolumeSpecName "kube-api-access-vwwp2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434371 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434427 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gcf6\" (UniqueName: \"kubernetes.io/projected/3e15f776-d153-4289-91c7-893584104185-kube-api-access-2gcf6\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434452 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434472 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434534 7508 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434544 7508 
reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/db40c498-c94c-4009-9620-5b4ff9e28668-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434553 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vwwp2\" (UniqueName: \"kubernetes.io/projected/db40c498-c94c-4009-9620-5b4ff9e28668-kube-api-access-vwwp2\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434565 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434573 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434581 7508 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f640710b-2b20-438d-b847-18b8fcb77b4c-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.434589 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mq9dc\" (UniqueName: \"kubernetes.io/projected/f640710b-2b20-438d-b847-18b8fcb77b4c-kube-api-access-mq9dc\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: E0313 10:36:38.434707 7508 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: E0313 10:36:38.434770 7508 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls podName:3e15f776-d153-4289-91c7-893584104185 nodeName:}" failed. No retries permitted until 2026-03-13 10:36:38.934741748 +0000 UTC m=+37.677566865 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls") pod "dns-default-qt95m" (UID: "3e15f776-d153-4289-91c7-893584104185") : secret "dns-default-metrics-tls" not found
Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.435715 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m"
Mar 13 10:36:38.438174 master-0 kubenswrapper[7508]: I0313 10:36:38.436309 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"controller-manager-b7b97495d-hp7mr\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") " pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:38.476121 master-0 kubenswrapper[7508]: I0313 10:36:38.470812 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gcf6\" (UniqueName: \"kubernetes.io/projected/3e15f776-d153-4289-91c7-893584104185-kube-api-access-2gcf6\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m"
Mar 13 10:36:38.546335 master-0 kubenswrapper[7508]: I0313 10:36:38.540910 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") pod \"db40c498-c94c-4009-9620-5b4ff9e28668\" (UID: \"db40c498-c94c-4009-9620-5b4ff9e28668\") "
Mar 13 10:36:38.546335 master-0 kubenswrapper[7508]: I0313 10:36:38.541541 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca" (OuterVolumeSpecName: "client-ca") pod "db40c498-c94c-4009-9620-5b4ff9e28668" (UID: "db40c498-c94c-4009-9620-5b4ff9e28668"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:38.610771 master-0 kubenswrapper[7508]: I0313 10:36:38.610413 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 13 10:36:38.642297 master-0 kubenswrapper[7508]: I0313 10:36:38.641682 7508 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/db40c498-c94c-4009-9620-5b4ff9e28668-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:38.695273 master-0 kubenswrapper[7508]: I0313 10:36:38.695132 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-d542b"]
Mar 13 10:36:38.696173 master-0 kubenswrapper[7508]: I0313 10:36:38.695842 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.863129 master-0 kubenswrapper[7508]: I0313 10:36:38.844760 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppjzw\" (UniqueName: \"kubernetes.io/projected/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-kube-api-access-ppjzw\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.863129 master-0 kubenswrapper[7508]: I0313 10:36:38.845088 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-hosts-file\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.951183 master-0 kubenswrapper[7508]: I0313 10:36:38.947000 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-hosts-file\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.951183 master-0 kubenswrapper[7508]: I0313 10:36:38.947117 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m"
Mar 13 10:36:38.951183 master-0 kubenswrapper[7508]: I0313 10:36:38.947257 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjzw\" (UniqueName: \"kubernetes.io/projected/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-kube-api-access-ppjzw\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.951183 master-0 kubenswrapper[7508]: I0313 10:36:38.948074 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-hosts-file\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.965171 master-0 kubenswrapper[7508]: I0313 10:36:38.957935 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m"
Mar 13 10:36:38.965171 master-0 kubenswrapper[7508]: I0313 10:36:38.964984 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjzw\" (UniqueName: \"kubernetes.io/projected/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-kube-api-access-ppjzw\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:38.993589 master-0 kubenswrapper[7508]: I0313 10:36:38.993513 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" event={"ID":"a3a72b45-a705-4335-9c04-c952ec5d9975","Type":"ContainerStarted","Data":"7b8b432491b64c35241699cea9dca0847beab01faa6b11ea1ee81f0edac7188e"}
Mar 13 10:36:38.993589 master-0 kubenswrapper[7508]: I0313 10:36:38.993584 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" event={"ID":"a3a72b45-a705-4335-9c04-c952ec5d9975","Type":"ContainerStarted","Data":"bb70bbe39b0a248a6aa4cef7e86697f7d917e3ba95ec678efc7f04cb53a9a7e7"}
Mar 13 10:36:39.008052 master-0 kubenswrapper[7508]: I0313 10:36:39.006998 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-567956995b-dmf5x" event={"ID":"742db892-aef7-428e-b0c0-b54c6c9bf48e","Type":"ContainerStarted","Data":"cbaf41dab9a2ee348b55ecc6df287959a27f53d4c1058f3224a3b3927f09019e"}
Mar 13 10:36:39.016049 master-0 kubenswrapper[7508]: I0313 10:36:39.015971 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" podStartSLOduration=2.015953477 podStartE2EDuration="2.015953477s" podCreationTimestamp="2026-03-13 10:36:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:39.015226728 +0000 UTC m=+37.758051845" watchObservedRunningTime="2026-03-13 10:36:39.015953477 +0000 UTC m=+37.758778594"
Mar 13 10:36:39.022601 master-0 kubenswrapper[7508]: I0313 10:36:39.022534 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e1a3cdd6-88be-4a7f-955c-2f0b22082e82","Type":"ContainerStarted","Data":"a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0"}
Mar 13 10:36:39.034764 master-0 kubenswrapper[7508]: I0313 10:36:39.034705 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c"}
Mar 13 10:36:39.034764 master-0 kubenswrapper[7508]: I0313 10:36:39.034768 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"640ae6e09ed226b337075233b9303b1fb0d56099898746f5ff9f07d686060f2d"}
Mar 13 10:36:39.035702 master-0 kubenswrapper[7508]: I0313 10:36:39.035676 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:36:39.035819 master-0 kubenswrapper[7508]: I0313 10:36:39.035800 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-d542b"
Mar 13 10:36:39.050692 master-0 kubenswrapper[7508]: I0313 10:36:39.046942 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-b7b97495d-hp7mr"
Mar 13 10:36:39.050692 master-0 kubenswrapper[7508]: I0313 10:36:39.047068 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"4e35d9e42c3db125a61c2fa53787bbec93b1a84b0ca9bbb457199baa790d8533"}
Mar 13 10:36:39.050692 master-0 kubenswrapper[7508]: I0313 10:36:39.047178 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946"}
Mar 13 10:36:39.050692 master-0 kubenswrapper[7508]: I0313 10:36:39.047300 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"
Mar 13 10:36:39.089034 master-0 kubenswrapper[7508]: I0313 10:36:39.088963 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=11.0889379 podStartE2EDuration="11.0889379s" podCreationTimestamp="2026-03-13 10:36:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:39.044008758 +0000 UTC m=+37.786833875" watchObservedRunningTime="2026-03-13 10:36:39.0889379 +0000 UTC m=+37.831763017"
Mar 13 10:36:39.089857 master-0 kubenswrapper[7508]: I0313 10:36:39.089817 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podStartSLOduration=4.089811113 podStartE2EDuration="4.089811113s" podCreationTimestamp="2026-03-13 10:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:39.088077338 +0000 UTC m=+37.830902475" watchObservedRunningTime="2026-03-13 10:36:39.089811113 +0000 UTC m=+37.832636230"
Mar 13 10:36:39.101977 master-0 kubenswrapper[7508]: I0313 10:36:39.101941 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-qt95m"
Mar 13 10:36:39.226179 master-0 kubenswrapper[7508]: I0313 10:36:39.225890 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podStartSLOduration=4.225861431 podStartE2EDuration="4.225861431s" podCreationTimestamp="2026-03-13 10:36:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:39.208060567 +0000 UTC m=+37.950885694" watchObservedRunningTime="2026-03-13 10:36:39.225861431 +0000 UTC m=+37.968686548"
Mar 13 10:36:39.248425 master-0 kubenswrapper[7508]: I0313 10:36:39.248359 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"]
Mar 13 10:36:39.252443 master-0 kubenswrapper[7508]: I0313 10:36:39.252276 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"]
Mar 13 10:36:39.253614 master-0 kubenswrapper[7508]: I0313 10:36:39.253586 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-787464f5f6-wjm6v"]
Mar 13 10:36:39.253974 master-0 kubenswrapper[7508]: I0313 10:36:39.253937 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.259920 master-0 kubenswrapper[7508]: I0313 10:36:39.259666 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 10:36:39.260055 master-0 kubenswrapper[7508]: I0313 10:36:39.260037 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 10:36:39.260243 master-0 kubenswrapper[7508]: I0313 10:36:39.260226 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:36:39.261114 master-0 kubenswrapper[7508]: I0313 10:36:39.260418 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 10:36:39.263942 master-0 kubenswrapper[7508]: I0313 10:36:39.263709 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 10:36:39.264590 master-0 kubenswrapper[7508]: I0313 10:36:39.264564 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"]
Mar 13 10:36:39.300263 master-0 kubenswrapper[7508]: I0313 10:36:39.293340 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-b7b97495d-hp7mr"]
Mar 13 10:36:39.301774 master-0 kubenswrapper[7508]: I0313 10:36:39.301738 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-b7b97495d-hp7mr"]
Mar 13 10:36:39.353980 master-0 kubenswrapper[7508]: I0313 10:36:39.353945 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-serving-cert\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.354206 master-0 kubenswrapper[7508]: I0313 10:36:39.354186 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-client-ca\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.354363 master-0 kubenswrapper[7508]: I0313 10:36:39.354338 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-config\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.369539 master-0 kubenswrapper[7508]: I0313 10:36:39.369477 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjbpt\" (UniqueName: \"kubernetes.io/projected/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-kube-api-access-hjbpt\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.369933 master-0 kubenswrapper[7508]: I0313 10:36:39.369918 7508 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f640710b-2b20-438d-b847-18b8fcb77b4c-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:39.474484 master-0 kubenswrapper[7508]: I0313 10:36:39.471317 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-config\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.474484 master-0 kubenswrapper[7508]: I0313 10:36:39.471667 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjbpt\" (UniqueName: \"kubernetes.io/projected/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-kube-api-access-hjbpt\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.474484 master-0 kubenswrapper[7508]: I0313 10:36:39.471724 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-serving-cert\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.474484 master-0 kubenswrapper[7508]: I0313 10:36:39.471753 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-client-ca\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.474484 master-0 kubenswrapper[7508]: I0313 10:36:39.472978 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-client-ca\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.474484 master-0 kubenswrapper[7508]: I0313 10:36:39.473826 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-config\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.481244 master-0 kubenswrapper[7508]: I0313 10:36:39.480033 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-serving-cert\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.497882 master-0 kubenswrapper[7508]: I0313 10:36:39.497806 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjbpt\" (UniqueName: \"kubernetes.io/projected/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-kube-api-access-hjbpt\") pod \"route-controller-manager-7c67dff5b-jmqzk\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:39.515482 master-0 kubenswrapper[7508]: I0313 10:36:39.515377 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db40c498-c94c-4009-9620-5b4ff9e28668" path="/var/lib/kubelet/pods/db40c498-c94c-4009-9620-5b4ff9e28668/volumes"
Mar 13 10:36:39.515896 master-0 kubenswrapper[7508]: I0313 10:36:39.515830 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f640710b-2b20-438d-b847-18b8fcb77b4c" path="/var/lib/kubelet/pods/f640710b-2b20-438d-b847-18b8fcb77b4c/volumes"
Mar 13 10:36:39.564473 master-0 kubenswrapper[7508]: W0313 10:36:39.561959 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3e15f776_d153_4289_91c7_893584104185.slice/crio-0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc WatchSource:0}: Error finding container 0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc: Status 404 returned error can't find the container with id 0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc
Mar 13 10:36:39.576402 master-0 kubenswrapper[7508]: I0313 10:36:39.571375 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-qt95m"]
Mar 13 10:36:39.598149 master-0 kubenswrapper[7508]: I0313 10:36:39.596953 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:40.047486 master-0 kubenswrapper[7508]: I0313 10:36:40.047431 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"]
Mar 13 10:36:40.059244 master-0 kubenswrapper[7508]: I0313 10:36:40.059132 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qt95m" event={"ID":"3e15f776-d153-4289-91c7-893584104185","Type":"ContainerStarted","Data":"0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc"}
Mar 13 10:36:40.062617 master-0 kubenswrapper[7508]: I0313 10:36:40.062588 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d542b" event={"ID":"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c","Type":"ContainerStarted","Data":"9e9770f157a4ec6cd726bd326d6c98845c1b4b7c517bb15dcbd0850a4395d902"}
Mar 13 10:36:40.062705 master-0 kubenswrapper[7508]: I0313 10:36:40.062621 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d542b" event={"ID":"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c","Type":"ContainerStarted","Data":"2659c5a6a41b8bd57f0bf3c1da691ca647e461b974a89f7c9f8fe2c464e9654a"}
Mar 13 10:36:40.062705 master-0 kubenswrapper[7508]: I0313 10:36:40.062641 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"
Mar 13 10:36:40.062862 master-0 kubenswrapper[7508]: I0313 10:36:40.062821 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="e1a3cdd6-88be-4a7f-955c-2f0b22082e82" containerName="installer" containerID="cri-o://a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0" gracePeriod=30
Mar 13 10:36:40.109261 master-0 kubenswrapper[7508]: I0313 10:36:40.107857 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-d542b" podStartSLOduration=2.107813103 podStartE2EDuration="2.107813103s" podCreationTimestamp="2026-03-13 10:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:40.105985586 +0000 UTC m=+38.848810723" watchObservedRunningTime="2026-03-13 10:36:40.107813103 +0000 UTC m=+38.850638220"
Mar 13 10:36:41.017804 master-0 kubenswrapper[7508]: I0313 10:36:41.017416 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 13 10:36:41.020479 master-0 kubenswrapper[7508]: I0313 10:36:41.020440 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.030172 master-0 kubenswrapper[7508]: I0313 10:36:41.027410 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 13 10:36:41.182468 master-0 kubenswrapper[7508]: I0313 10:36:41.182233 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.182468 master-0 kubenswrapper[7508]: I0313 10:36:41.182337 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.182468 master-0 kubenswrapper[7508]: I0313 10:36:41.182364 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-var-lock\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.362912 master-0 kubenswrapper[7508]: I0313 10:36:41.289872 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.362912 master-0 kubenswrapper[7508]: I0313 10:36:41.290001 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-var-lock\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.362912 master-0 kubenswrapper[7508]: I0313 10:36:41.290160 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.362912 master-0 kubenswrapper[7508]: I0313 10:36:41.291070 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-var-lock\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.427779 master-0 kubenswrapper[7508]: I0313 10:36:41.291378 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.475846 master-0 kubenswrapper[7508]: I0313 10:36:41.475749 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kube-api-access\") pod \"installer-2-master-0\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") " pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:41.492256 master-0 kubenswrapper[7508]: I0313 10:36:41.490276 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"]
Mar 13 10:36:41.492256 master-0 kubenswrapper[7508]: I0313 10:36:41.491294 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.494574 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.494907 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.495069 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.495303 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.495571 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.495982 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 10:36:41.496395 master-0 kubenswrapper[7508]: I0313 10:36:41.496253 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 10:36:41.496972 master-0 kubenswrapper[7508]: I0313 10:36:41.496497 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 13 10:36:41.520538 master-0 kubenswrapper[7508]: I0313 10:36:41.520470 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-trusted-ca-bundle\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520538 master-0 kubenswrapper[7508]: I0313 10:36:41.520530 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-encryption-config\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520922 master-0 kubenswrapper[7508]: I0313 10:36:41.520555 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nr6p\" (UniqueName: \"kubernetes.io/projected/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-kube-api-access-4nr6p\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520922 master-0 kubenswrapper[7508]: I0313 10:36:41.520582 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-serving-ca\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520922 master-0 kubenswrapper[7508]: I0313 10:36:41.520600 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-policies\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520922 master-0 kubenswrapper[7508]: I0313 10:36:41.520630 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-serving-cert\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520922 master-0 kubenswrapper[7508]: I0313 10:36:41.520649 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-dir\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.520922 master-0 kubenswrapper[7508]: I0313 10:36:41.520664 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-client\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624005 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-serving-cert\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624044 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-client\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624060 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-dir\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624080 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-trusted-ca-bundle\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624128 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-encryption-config\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624144 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr6p\" (UniqueName: \"kubernetes.io/projected/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-kube-api-access-4nr6p\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624168 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-serving-ca\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624185 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-policies\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.624892 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-policies\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.626418 master-0 kubenswrapper[7508]: I0313 10:36:41.625517 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-trusted-ca-bundle\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.628271 master-0 kubenswrapper[7508]: I0313 10:36:41.628221 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"]
Mar 13 10:36:41.630293 master-0 kubenswrapper[7508]: I0313 10:36:41.630230 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-dir\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:36:41.631662 master-0 kubenswrapper[7508]:
I0313 10:36:41.630783 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-serving-ca\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:36:41.633969 master-0 kubenswrapper[7508]: I0313 10:36:41.632366 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-serving-cert\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:36:41.633969 master-0 kubenswrapper[7508]: I0313 10:36:41.632802 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-encryption-config\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:36:41.636273 master-0 kubenswrapper[7508]: I0313 10:36:41.635833 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-client\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:36:41.664049 master-0 kubenswrapper[7508]: I0313 10:36:41.663503 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 13 10:36:41.676539 master-0 kubenswrapper[7508]: I0313 10:36:41.676397 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr6p\" (UniqueName: \"kubernetes.io/projected/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-kube-api-access-4nr6p\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:36:41.694528 master-0 kubenswrapper[7508]: I0313 10:36:41.692508 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5997c88d95-94gwc"] Mar 13 10:36:41.694528 master-0 kubenswrapper[7508]: I0313 10:36:41.693120 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:41.703270 master-0 kubenswrapper[7508]: I0313 10:36:41.698259 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5997c88d95-94gwc"] Mar 13 10:36:41.707065 master-0 kubenswrapper[7508]: I0313 10:36:41.706858 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 10:36:41.707889 master-0 kubenswrapper[7508]: I0313 10:36:41.707820 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 10:36:41.708552 master-0 kubenswrapper[7508]: I0313 10:36:41.708509 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 10:36:41.708644 master-0 kubenswrapper[7508]: I0313 10:36:41.708574 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 10:36:41.709029 master-0 kubenswrapper[7508]: I0313 10:36:41.708938 7508 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 10:36:41.713318 master-0 kubenswrapper[7508]: I0313 10:36:41.713182 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 10:36:41.929626 master-0 kubenswrapper[7508]: I0313 10:36:41.929477 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-config\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:41.929626 master-0 kubenswrapper[7508]: I0313 10:36:41.929555 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a855e43e-f243-4397-a92f-60285f679eee-serving-cert\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:41.929626 master-0 kubenswrapper[7508]: I0313 10:36:41.929610 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-client-ca\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:41.929886 master-0 kubenswrapper[7508]: I0313 10:36:41.929654 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-proxy-ca-bundles\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " 
pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:41.929886 master-0 kubenswrapper[7508]: I0313 10:36:41.929682 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9szlb\" (UniqueName: \"kubernetes.io/projected/a855e43e-f243-4397-a92f-60285f679eee-kube-api-access-9szlb\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:41.929972 master-0 kubenswrapper[7508]: I0313 10:36:41.929955 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:36:42.031405 master-0 kubenswrapper[7508]: I0313 10:36:42.031334 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-client-ca\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.031633 master-0 kubenswrapper[7508]: I0313 10:36:42.031409 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-proxy-ca-bundles\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.031633 master-0 kubenswrapper[7508]: I0313 10:36:42.031487 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9szlb\" (UniqueName: \"kubernetes.io/projected/a855e43e-f243-4397-a92f-60285f679eee-kube-api-access-9szlb\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " 
pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.031633 master-0 kubenswrapper[7508]: I0313 10:36:42.031520 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-config\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.031633 master-0 kubenswrapper[7508]: I0313 10:36:42.031560 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a855e43e-f243-4397-a92f-60285f679eee-serving-cert\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.047111 master-0 kubenswrapper[7508]: I0313 10:36:42.033900 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-client-ca\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.047111 master-0 kubenswrapper[7508]: I0313 10:36:42.034895 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-config\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.047111 master-0 kubenswrapper[7508]: I0313 10:36:42.041473 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a855e43e-f243-4397-a92f-60285f679eee-serving-cert\") pod 
\"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.047752 master-0 kubenswrapper[7508]: I0313 10:36:42.047637 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-proxy-ca-bundles\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.178153 master-0 kubenswrapper[7508]: I0313 10:36:42.178106 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9szlb\" (UniqueName: \"kubernetes.io/projected/a855e43e-f243-4397-a92f-60285f679eee-kube-api-access-9szlb\") pod \"controller-manager-5997c88d95-94gwc\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:42.384786 master-0 kubenswrapper[7508]: I0313 10:36:42.384736 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:36:43.192377 master-0 kubenswrapper[7508]: I0313 10:36:43.192236 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" event={"ID":"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c","Type":"ContainerStarted","Data":"e10b8dc5da77f02b966d2a4bc5f393d98d92a2cd51a02f0fa6d43be469f40a62"} Mar 13 10:36:46.085375 master-0 kubenswrapper[7508]: I0313 10:36:46.085237 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:36:46.086523 master-0 kubenswrapper[7508]: I0313 10:36:46.086429 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:36:46.223135 master-0 kubenswrapper[7508]: I0313 10:36:46.222870 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 13 10:36:46.224697 master-0 kubenswrapper[7508]: I0313 10:36:46.224623 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.227348 master-0 kubenswrapper[7508]: I0313 10:36:46.227298 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 13 10:36:46.232750 master-0 kubenswrapper[7508]: I0313 10:36:46.232173 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 13 10:36:46.383143 master-0 kubenswrapper[7508]: I0313 10:36:46.383075 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-var-lock\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.383699 master-0 kubenswrapper[7508]: I0313 10:36:46.383159 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.383699 master-0 kubenswrapper[7508]: I0313 10:36:46.383201 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/994d29a3-98d8-45bd-8922-adcdc899b632-kube-api-access\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.546132 master-0 kubenswrapper[7508]: I0313 10:36:46.546025 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " 
pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.546429 master-0 kubenswrapper[7508]: I0313 10:36:46.546201 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/994d29a3-98d8-45bd-8922-adcdc899b632-kube-api-access\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.546429 master-0 kubenswrapper[7508]: I0313 10:36:46.546234 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-var-lock\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.546429 master-0 kubenswrapper[7508]: I0313 10:36:46.546355 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-var-lock\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.546429 master-0 kubenswrapper[7508]: I0313 10:36:46.546408 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.567730 master-0 kubenswrapper[7508]: I0313 10:36:46.567651 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/994d29a3-98d8-45bd-8922-adcdc899b632-kube-api-access\") pod \"installer-1-master-0\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:46.578178 master-0 kubenswrapper[7508]: 
I0313 10:36:46.577650 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 10:36:50.179885 master-0 kubenswrapper[7508]: I0313 10:36:50.179827 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 10:36:50.180533 master-0 kubenswrapper[7508]: I0313 10:36:50.180499 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.182647 master-0 kubenswrapper[7508]: I0313 10:36:50.182583 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 10:36:50.242224 master-0 kubenswrapper[7508]: I0313 10:36:50.241502 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 10:36:50.451404 master-0 kubenswrapper[7508]: I0313 10:36:50.451281 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.451404 master-0 kubenswrapper[7508]: I0313 10:36:50.451367 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-var-lock\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.451404 master-0 kubenswrapper[7508]: I0313 10:36:50.451405 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/d04e4749-2b79-49e2-a451-a2733443a913-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.552901 master-0 kubenswrapper[7508]: I0313 10:36:50.552780 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04e4749-2b79-49e2-a451-a2733443a913-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.552901 master-0 kubenswrapper[7508]: I0313 10:36:50.552925 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.553261 master-0 kubenswrapper[7508]: I0313 10:36:50.552988 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-var-lock\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.553261 master-0 kubenswrapper[7508]: I0313 10:36:50.553074 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-var-lock\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.553261 master-0 kubenswrapper[7508]: I0313 10:36:50.553214 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.694947 master-0 kubenswrapper[7508]: I0313 10:36:50.694354 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04e4749-2b79-49e2-a451-a2733443a913-kube-api-access\") pod \"installer-1-master-0\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:50.811560 master-0 kubenswrapper[7508]: I0313 10:36:50.811319 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:36:52.806513 master-0 kubenswrapper[7508]: I0313 10:36:52.804309 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 10:36:53.087150 master-0 kubenswrapper[7508]: I0313 10:36:53.086470 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 13 10:36:53.127518 master-0 kubenswrapper[7508]: W0313 10:36:53.127463 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podafb27f7f_4a24_404a_b9cd_206a1c33eb3c.slice/crio-02bfbb38ca9efee460deebc093497c9b41495da7bacedfeea04543e2228d1f68 WatchSource:0}: Error finding container 02bfbb38ca9efee460deebc093497c9b41495da7bacedfeea04543e2228d1f68: Status 404 returned error can't find the container with id 02bfbb38ca9efee460deebc093497c9b41495da7bacedfeea04543e2228d1f68 Mar 13 10:36:53.137229 master-0 kubenswrapper[7508]: I0313 10:36:53.136748 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 10:36:53.195194 master-0 kubenswrapper[7508]: I0313 10:36:53.195132 7508 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"] Mar 13 10:36:53.204299 master-0 kubenswrapper[7508]: I0313 10:36:53.204247 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5997c88d95-94gwc"] Mar 13 10:36:53.284148 master-0 kubenswrapper[7508]: I0313 10:36:53.284071 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 13 10:36:53.296390 master-0 kubenswrapper[7508]: I0313 10:36:53.296254 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" event={"ID":"a9258b0f-fdcc-4bfa-b982-5cf3c899c432","Type":"ContainerStarted","Data":"4a9a41f76fe188e7c2fc303922714d8a4a4540bbc426c47477e0dbcbe14a461c"} Mar 13 10:36:53.299629 master-0 kubenswrapper[7508]: W0313 10:36:53.299516 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod994d29a3_98d8_45bd_8922_adcdc899b632.slice/crio-86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8 WatchSource:0}: Error finding container 86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8: Status 404 returned error can't find the container with id 86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8 Mar 13 10:36:53.303474 master-0 kubenswrapper[7508]: I0313 10:36:53.302518 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" event={"ID":"024d9bd3-ac77-4257-9808-7518f2a73e11","Type":"ContainerStarted","Data":"b9231178429930d79290e4d816cda8b0b95b77b22b615d27922f30211e7570b4"} Mar 13 10:36:53.303474 master-0 kubenswrapper[7508]: I0313 10:36:53.303228 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:36:53.310639 master-0 kubenswrapper[7508]: I0313 10:36:53.310388 7508 
patch_prober.go:28] interesting pod/olm-operator-d64cfc9db-h46sf container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.11:8443/healthz\": dial tcp 10.128.0.11:8443: connect: connection refused" start-of-body= Mar 13 10:36:53.310639 master-0 kubenswrapper[7508]: I0313 10:36:53.310465 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" podUID="024d9bd3-ac77-4257-9808-7518f2a73e11" containerName="olm-operator" probeResult="failure" output="Get \"https://10.128.0.11:8443/healthz\": dial tcp 10.128.0.11:8443: connect: connection refused" Mar 13 10:36:53.312886 master-0 kubenswrapper[7508]: I0313 10:36:53.312826 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" event={"ID":"fd91626c-38a8-462f-8bc0-96d57532de87","Type":"ContainerStarted","Data":"c7526d564e3a6f102aadf838e6bbb178d8da329a07b6933f64af4c716253d4e9"} Mar 13 10:36:53.312886 master-0 kubenswrapper[7508]: I0313 10:36:53.312876 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" event={"ID":"fd91626c-38a8-462f-8bc0-96d57532de87","Type":"ContainerStarted","Data":"e67aecaee25884cdb16b8c22b14a8ace233f4db4719fca42142020ecb4c32d8c"} Mar 13 10:36:53.329052 master-0 kubenswrapper[7508]: I0313 10:36:53.328915 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d04e4749-2b79-49e2-a451-a2733443a913","Type":"ContainerStarted","Data":"f1b14a227c8bc8b981f29cfb4546b0b823b1f503f3e6f7c9e6a036205e1e83ce"} Mar 13 10:36:53.341565 master-0 kubenswrapper[7508]: I0313 10:36:53.341316 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" 
event={"ID":"a855e43e-f243-4397-a92f-60285f679eee","Type":"ContainerStarted","Data":"942ce849bf198647449825e7f0abf9526ab36c0255252e914e77dd2cc7aba0c5"}
Mar 13 10:36:53.362948 master-0 kubenswrapper[7508]: I0313 10:36:53.362851 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerStarted","Data":"2276fd8efc0fde40f37ca319cd91132fc15d5529319ce35ac0901720d64c7ce3"}
Mar 13 10:36:53.364247 master-0 kubenswrapper[7508]: I0313 10:36:53.363652 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" podStartSLOduration=15.226355282 podStartE2EDuration="31.363627484s" podCreationTimestamp="2026-03-13 10:36:22 +0000 UTC" firstStartedPulling="2026-03-13 10:36:36.453203217 +0000 UTC m=+35.196028334" lastFinishedPulling="2026-03-13 10:36:52.590475419 +0000 UTC m=+51.333300536" observedRunningTime="2026-03-13 10:36:53.362606547 +0000 UTC m=+52.105431684" watchObservedRunningTime="2026-03-13 10:36:53.363627484 +0000 UTC m=+52.106452601"
Mar 13 10:36:53.370614 master-0 kubenswrapper[7508]: I0313 10:36:53.369468 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerStarted","Data":"a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f"}
Mar 13 10:36:53.370614 master-0 kubenswrapper[7508]: I0313 10:36:53.369999 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:53.374740 master-0 kubenswrapper[7508]: I0313 10:36:53.374427 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body=
Mar 13 10:36:53.374740 master-0 kubenswrapper[7508]: I0313 10:36:53.374493 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused"
Mar 13 10:36:53.381368 master-0 kubenswrapper[7508]: I0313 10:36:53.380053 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"afb27f7f-4a24-404a-b9cd-206a1c33eb3c","Type":"ContainerStarted","Data":"02bfbb38ca9efee460deebc093497c9b41495da7bacedfeea04543e2228d1f68"}
Mar 13 10:36:53.406069 master-0 kubenswrapper[7508]: I0313 10:36:53.401721 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" event={"ID":"a13f3e08-2b67-404f-8695-77aa17f92137","Type":"ContainerStarted","Data":"804af0f197810492da9674aa46937e1801ae5a14e02f73596e29a002fe9774f2"}
Mar 13 10:36:53.406069 master-0 kubenswrapper[7508]: I0313 10:36:53.403450 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:36:53.423378 master-0 kubenswrapper[7508]: I0313 10:36:53.417520 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" event={"ID":"03b97fde-467c-46f0-95f9-9c3820b4d790","Type":"ContainerStarted","Data":"466ebfbc9d939fb59cc09aed7d0174adb466a23bb438e923666b0bfead02089f"}
Mar 13 10:36:53.423378 master-0 kubenswrapper[7508]: I0313 10:36:53.417888 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:53.434540 master-0 kubenswrapper[7508]: I0313 10:36:53.434433 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" event={"ID":"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c","Type":"ContainerStarted","Data":"416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050"}
Mar 13 10:36:53.434540 master-0 kubenswrapper[7508]: I0313 10:36:53.434538 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:53.440174 master-0 kubenswrapper[7508]: I0313 10:36:53.440080 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:36:53.459619 master-0 kubenswrapper[7508]: I0313 10:36:53.459530 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" event={"ID":"17b956d3-c046-4f26-8be2-718c165a3acc","Type":"ContainerStarted","Data":"3fd3883c8b186f065fdd7d04082a866d1d3335a481da8ad2d7fb2179391a51ba"}
Mar 13 10:36:53.477481 master-0 kubenswrapper[7508]: I0313 10:36:53.477404 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c5vhc" event={"ID":"8df2728b-4f21-4aef-b31f-4197bbcd2728","Type":"ContainerStarted","Data":"17b0b898af4d319cde841f52d81826f891aba80f9ced795a7749caba01d53d9e"}
Mar 13 10:36:53.595203 master-0 kubenswrapper[7508]: I0313 10:36:53.595052 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" podStartSLOduration=6.429419886 podStartE2EDuration="16.595033309s" podCreationTimestamp="2026-03-13 10:36:37 +0000 UTC" firstStartedPulling="2026-03-13 10:36:42.518900808 +0000 UTC m=+41.261725925" lastFinishedPulling="2026-03-13 10:36:52.684514231 +0000 UTC m=+51.427339348" observedRunningTime="2026-03-13 10:36:53.545622361 +0000 UTC m=+52.288447478" watchObservedRunningTime="2026-03-13 10:36:53.595033309 +0000 UTC m=+52.337858426"
Mar 13 10:36:53.661521 master-0 kubenswrapper[7508]: I0313 10:36:53.661399 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"
Mar 13 10:36:54.082409 master-0 kubenswrapper[7508]: I0313 10:36:54.080572 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"]
Mar 13 10:36:54.082409 master-0 kubenswrapper[7508]: I0313 10:36:54.080876 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" podUID="b04498f0-5a3f-4461-aecb-50304662d854" containerName="cluster-version-operator" containerID="cri-o://3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc" gracePeriod=130
Mar 13 10:36:54.261164 master-0 kubenswrapper[7508]: I0313 10:36:54.261089 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:54.405447 master-0 kubenswrapper[7508]: I0313 10:36:54.405394 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") pod \"b04498f0-5a3f-4461-aecb-50304662d854\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") "
Mar 13 10:36:54.405709 master-0 kubenswrapper[7508]: I0313 10:36:54.405472 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") pod \"b04498f0-5a3f-4461-aecb-50304662d854\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") "
Mar 13 10:36:54.405709 master-0 kubenswrapper[7508]: I0313 10:36:54.405510 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") pod \"b04498f0-5a3f-4461-aecb-50304662d854\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") "
Mar 13 10:36:54.405709 master-0 kubenswrapper[7508]: I0313 10:36:54.405545 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") pod \"b04498f0-5a3f-4461-aecb-50304662d854\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") "
Mar 13 10:36:54.405709 master-0 kubenswrapper[7508]: I0313 10:36:54.405600 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") pod \"b04498f0-5a3f-4461-aecb-50304662d854\" (UID: \"b04498f0-5a3f-4461-aecb-50304662d854\") "
Mar 13 10:36:54.406122 master-0 kubenswrapper[7508]: I0313 10:36:54.405787 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "b04498f0-5a3f-4461-aecb-50304662d854" (UID: "b04498f0-5a3f-4461-aecb-50304662d854"). InnerVolumeSpecName "etc-ssl-certs". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:36:54.406122 master-0 kubenswrapper[7508]: I0313 10:36:54.405826 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "b04498f0-5a3f-4461-aecb-50304662d854" (UID: "b04498f0-5a3f-4461-aecb-50304662d854"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:36:54.408644 master-0 kubenswrapper[7508]: I0313 10:36:54.407139 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca" (OuterVolumeSpecName: "service-ca") pod "b04498f0-5a3f-4461-aecb-50304662d854" (UID: "b04498f0-5a3f-4461-aecb-50304662d854"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:54.413119 master-0 kubenswrapper[7508]: I0313 10:36:54.411175 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b04498f0-5a3f-4461-aecb-50304662d854" (UID: "b04498f0-5a3f-4461-aecb-50304662d854"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:36:54.422199 master-0 kubenswrapper[7508]: I0313 10:36:54.416102 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b04498f0-5a3f-4461-aecb-50304662d854" (UID: "b04498f0-5a3f-4461-aecb-50304662d854"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:36:54.486714 master-0 kubenswrapper[7508]: I0313 10:36:54.486499 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"994d29a3-98d8-45bd-8922-adcdc899b632","Type":"ContainerStarted","Data":"ccc3b2c6e99cb63369120234f78e03c40f7502629397be2489760d94a1bdc974"}
Mar 13 10:36:54.486714 master-0 kubenswrapper[7508]: I0313 10:36:54.486567 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"994d29a3-98d8-45bd-8922-adcdc899b632","Type":"ContainerStarted","Data":"86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8"}
Mar 13 10:36:54.492541 master-0 kubenswrapper[7508]: I0313 10:36:54.491836 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c5vhc" event={"ID":"8df2728b-4f21-4aef-b31f-4197bbcd2728","Type":"ContainerStarted","Data":"d7182bbcf3b04cf73af9cfa3d474e42048ccc1adcf54c50ae6cfbada1d1719cb"}
Mar 13 10:36:54.496277 master-0 kubenswrapper[7508]: I0313 10:36:54.496234 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qt95m" event={"ID":"3e15f776-d153-4289-91c7-893584104185","Type":"ContainerStarted","Data":"2528a54a7759acfdc3fe126441a21f19149e533f1dbaa637ff6c6a614cf49d43"}
Mar 13 10:36:54.496372 master-0 kubenswrapper[7508]: I0313 10:36:54.496286 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-qt95m" event={"ID":"3e15f776-d153-4289-91c7-893584104185","Type":"ContainerStarted","Data":"93c03d588a52c02f24836e7f6d43d2d3736701ecba4e7a62bd725be4b7b6fd4c"}
Mar 13 10:36:54.496406 master-0 kubenswrapper[7508]: I0313 10:36:54.496388 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qt95m"
Mar 13 10:36:54.498144 master-0 kubenswrapper[7508]: I0313 10:36:54.497972 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d04e4749-2b79-49e2-a451-a2733443a913","Type":"ContainerStarted","Data":"6bd307155c0397e849a532ef6dcebc4cbbbf850ed4d002b219c4c046ec36c6b8"}
Mar 13 10:36:54.500294 master-0 kubenswrapper[7508]: I0313 10:36:54.500139 7508 generic.go:334] "Generic (PLEG): container finished" podID="742db892-aef7-428e-b0c0-b54c6c9bf48e" containerID="f3bca609c73534ebcfd104c88e8548957e5b2088adf9149ab9ef045bc0dd24ff" exitCode=0
Mar 13 10:36:54.500294 master-0 kubenswrapper[7508]: I0313 10:36:54.500220 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-567956995b-dmf5x" event={"ID":"742db892-aef7-428e-b0c0-b54c6c9bf48e","Type":"ContainerDied","Data":"f3bca609c73534ebcfd104c88e8548957e5b2088adf9149ab9ef045bc0dd24ff"}
Mar 13 10:36:54.503433 master-0 kubenswrapper[7508]: I0313 10:36:54.503409 7508 generic.go:334] "Generic (PLEG): container finished" podID="b04498f0-5a3f-4461-aecb-50304662d854" containerID="3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc" exitCode=0
Mar 13 10:36:54.503499 master-0 kubenswrapper[7508]: I0313 10:36:54.503469 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" event={"ID":"b04498f0-5a3f-4461-aecb-50304662d854","Type":"ContainerDied","Data":"3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc"}
Mar 13 10:36:54.503499 master-0 kubenswrapper[7508]: I0313 10:36:54.503488 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"
Mar 13 10:36:54.503561 master-0 kubenswrapper[7508]: I0313 10:36:54.503539 7508 scope.go:117] "RemoveContainer" containerID="3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc"
Mar 13 10:36:54.504604 master-0 kubenswrapper[7508]: I0313 10:36:54.503493 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm" event={"ID":"b04498f0-5a3f-4461-aecb-50304662d854","Type":"ContainerDied","Data":"ca676bc5b7f3f8bc75644ceb62fe29437e3a8b2aa60b785e14180ce2eda8836e"}
Mar 13 10:36:54.506156 master-0 kubenswrapper[7508]: I0313 10:36:54.505226 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=8.505217957 podStartE2EDuration="8.505217957s" podCreationTimestamp="2026-03-13 10:36:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:54.504122749 +0000 UTC m=+53.246947866" watchObservedRunningTime="2026-03-13 10:36:54.505217957 +0000 UTC m=+53.248043064"
Mar 13 10:36:54.520474 master-0 kubenswrapper[7508]: I0313 10:36:54.515039 7508 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-ssl-certs\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.520474 master-0 kubenswrapper[7508]: I0313 10:36:54.515072 7508 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b04498f0-5a3f-4461-aecb-50304662d854-service-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.520474 master-0 kubenswrapper[7508]: I0313 10:36:54.515082 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b04498f0-5a3f-4461-aecb-50304662d854-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.520474 master-0 kubenswrapper[7508]: I0313 10:36:54.515104 7508 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b04498f0-5a3f-4461-aecb-50304662d854-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.520474 master-0 kubenswrapper[7508]: I0313 10:36:54.515113 7508 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b04498f0-5a3f-4461-aecb-50304662d854-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.524645 master-0 kubenswrapper[7508]: I0313 10:36:54.524589 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerStarted","Data":"a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4"}
Mar 13 10:36:54.526852 master-0 kubenswrapper[7508]: I0313 10:36:54.526821 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="afb27f7f-4a24-404a-b9cd-206a1c33eb3c" containerName="installer" containerID="cri-o://0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e" gracePeriod=30
Mar 13 10:36:54.527254 master-0 kubenswrapper[7508]: I0313 10:36:54.527234 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"afb27f7f-4a24-404a-b9cd-206a1c33eb3c","Type":"ContainerStarted","Data":"0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e"}
Mar 13 10:36:54.529750 master-0 kubenswrapper[7508]: I0313 10:36:54.529714 7508 scope.go:117] "RemoveContainer" containerID="3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc"
Mar 13 10:36:54.531830 master-0 kubenswrapper[7508]: E0313 10:36:54.530695 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc\": container with ID starting with 3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc not found: ID does not exist" containerID="3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc"
Mar 13 10:36:54.531830 master-0 kubenswrapper[7508]: I0313 10:36:54.530776 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc"} err="failed to get container status \"3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc\": rpc error: code = NotFound desc = could not find container \"3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc\": container with ID starting with 3276a7074c2a48542b75700dc9f0c250b15ab83b0427d08fa18f08f9452482bc not found: ID does not exist"
Mar 13 10:36:54.537144 master-0 kubenswrapper[7508]: I0313 10:36:54.537058 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:36:54.543303 master-0 kubenswrapper[7508]: I0313 10:36:54.543246 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:36:54.605231 master-0 kubenswrapper[7508]: I0313 10:36:54.605154 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=4.605133833 podStartE2EDuration="4.605133833s" podCreationTimestamp="2026-03-13 10:36:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:54.603764058 +0000 UTC m=+53.346589175" watchObservedRunningTime="2026-03-13 10:36:54.605133833 +0000 UTC m=+53.347958950"
Mar 13 10:36:54.639108 master-0 kubenswrapper[7508]: I0313 10:36:54.639043 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-cbmz8"]
Mar 13 10:36:54.639719 master-0 kubenswrapper[7508]: E0313 10:36:54.639569 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b04498f0-5a3f-4461-aecb-50304662d854" containerName="cluster-version-operator"
Mar 13 10:36:54.639876 master-0 kubenswrapper[7508]: I0313 10:36:54.639860 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="b04498f0-5a3f-4461-aecb-50304662d854" containerName="cluster-version-operator"
Mar 13 10:36:54.640119 master-0 kubenswrapper[7508]: I0313 10:36:54.640076 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="b04498f0-5a3f-4461-aecb-50304662d854" containerName="cluster-version-operator"
Mar 13 10:36:54.640972 master-0 kubenswrapper[7508]: I0313 10:36:54.640955 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.646447 master-0 kubenswrapper[7508]: I0313 10:36:54.646313 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-qt95m" podStartSLOduration=3.500665421 podStartE2EDuration="16.646285427s" podCreationTimestamp="2026-03-13 10:36:38 +0000 UTC" firstStartedPulling="2026-03-13 10:36:39.566482295 +0000 UTC m=+38.309307412" lastFinishedPulling="2026-03-13 10:36:52.712102221 +0000 UTC m=+51.454927418" observedRunningTime="2026-03-13 10:36:54.638722019 +0000 UTC m=+53.381547156" watchObservedRunningTime="2026-03-13 10:36:54.646285427 +0000 UTC m=+53.389110544"
Mar 13 10:36:54.653948 master-0 kubenswrapper[7508]: I0313 10:36:54.653901 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbmz8"]
Mar 13 10:36:54.733191 master-0 kubenswrapper[7508]: I0313 10:36:54.731520 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-catalog-content\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.733191 master-0 kubenswrapper[7508]: I0313 10:36:54.731572 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gk52d\" (UniqueName: \"kubernetes.io/projected/1b072636-e46b-47f6-af85-3210e62bbd2d-kube-api-access-gk52d\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.733191 master-0 kubenswrapper[7508]: I0313 10:36:54.731598 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-utilities\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.750003 master-0 kubenswrapper[7508]: I0313 10:36:54.747792 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-wlpwf"]
Mar 13 10:36:54.750003 master-0 kubenswrapper[7508]: I0313 10:36:54.749084 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:54.766685 master-0 kubenswrapper[7508]: I0313 10:36:54.764180 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlpwf"]
Mar 13 10:36:54.774488 master-0 kubenswrapper[7508]: I0313 10:36:54.774420 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=14.774395918 podStartE2EDuration="14.774395918s" podCreationTimestamp="2026-03-13 10:36:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:54.771196845 +0000 UTC m=+53.514021972" watchObservedRunningTime="2026-03-13 10:36:54.774395918 +0000 UTC m=+53.517221035"
Mar 13 10:36:54.800303 master-0 kubenswrapper[7508]: I0313 10:36:54.798431 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"]
Mar 13 10:36:54.805829 master-0 kubenswrapper[7508]: I0313 10:36:54.805796 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-567956995b-dmf5x"
Mar 13 10:36:54.813863 master-0 kubenswrapper[7508]: I0313 10:36:54.812316 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-wlkwm"]
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.831979 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-encryption-config\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832061 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm5z8\" (UniqueName: \"kubernetes.io/projected/742db892-aef7-428e-b0c0-b54c6c9bf48e-kube-api-access-gm5z8\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832184 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-serving-ca\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832217 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832250 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-serving-cert\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832274 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-trusted-ca-bundle\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832317 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-image-import-ca\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832391 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-client\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832432 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-config\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832476 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit-dir\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832503 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-node-pullsecrets\") pod \"742db892-aef7-428e-b0c0-b54c6c9bf48e\" (UID: \"742db892-aef7-428e-b0c0-b54c6c9bf48e\") "
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832652 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ps4tc\" (UniqueName: \"kubernetes.io/projected/20df9416-90f4-4c21-a3bc-c6e5f6622e15-kube-api-access-ps4tc\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832689 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-catalog-content\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832739 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-utilities\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832782 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-catalog-content\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832805 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gk52d\" (UniqueName: \"kubernetes.io/projected/1b072636-e46b-47f6-af85-3210e62bbd2d-kube-api-access-gk52d\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.832839 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-utilities\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.833725 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.834412 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit" (OuterVolumeSpecName: "audit") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.834468 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.834848 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.835201 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-catalog-content\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.835221 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-utilities\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.835328 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:36:54.835850 master-0 kubenswrapper[7508]: I0313 10:36:54.835413 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:54.842427 master-0 kubenswrapper[7508]: I0313 10:36:54.840530 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-config" (OuterVolumeSpecName: "config") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:36:54.842769 master-0 kubenswrapper[7508]: I0313 10:36:54.842741 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:36:54.844904 master-0 kubenswrapper[7508]: I0313 10:36:54.843441 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:36:54.844904 master-0 kubenswrapper[7508]: I0313 10:36:54.843673 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742db892-aef7-428e-b0c0-b54c6c9bf48e-kube-api-access-gm5z8" (OuterVolumeSpecName: "kube-api-access-gm5z8") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "kube-api-access-gm5z8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:36:54.844904 master-0 kubenswrapper[7508]: I0313 10:36:54.843807 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "742db892-aef7-428e-b0c0-b54c6c9bf48e" (UID: "742db892-aef7-428e-b0c0-b54c6c9bf48e"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:36:54.859634 master-0 kubenswrapper[7508]: I0313 10:36:54.853028 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"]
Mar 13 10:36:54.859634 master-0 kubenswrapper[7508]: E0313 10:36:54.853253 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="742db892-aef7-428e-b0c0-b54c6c9bf48e" containerName="fix-audit-permissions"
Mar 13 10:36:54.859634 master-0 kubenswrapper[7508]: I0313 10:36:54.853267 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="742db892-aef7-428e-b0c0-b54c6c9bf48e" containerName="fix-audit-permissions"
Mar 13 10:36:54.859634 master-0 kubenswrapper[7508]: I0313 10:36:54.853841 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="742db892-aef7-428e-b0c0-b54c6c9bf48e" containerName="fix-audit-permissions"
Mar 13 10:36:54.859634 master-0 kubenswrapper[7508]: I0313 10:36:54.854202 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:54.859990 master-0 kubenswrapper[7508]: I0313 10:36:54.859937 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 10:36:54.860649 master-0 kubenswrapper[7508]: I0313 10:36:54.860177 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 10:36:54.860950 master-0 kubenswrapper[7508]: I0313 10:36:54.860932 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 10:36:54.879560 master-0 kubenswrapper[7508]: I0313 10:36:54.879520 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gk52d\" (UniqueName: \"kubernetes.io/projected/1b072636-e46b-47f6-af85-3210e62bbd2d-kube-api-access-gk52d\") pod \"redhat-marketplace-cbmz8\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:54.934389 master-0 kubenswrapper[7508]: I0313 10:36:54.934320 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ps4tc\" (UniqueName: \"kubernetes.io/projected/20df9416-90f4-4c21-a3bc-c6e5f6622e15-kube-api-access-ps4tc\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:54.934389 master-0 kubenswrapper[7508]: I0313 10:36:54.934376 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-catalog-content\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:54.935011 master-0 kubenswrapper[7508]: I0313
10:36:54.934959 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5ed7aff-47c0-42f3-9a26-9385d2bde582-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:36:54.935262 master-0 kubenswrapper[7508]: I0313 10:36:54.935243 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5ed7aff-47c0-42f3-9a26-9385d2bde582-service-ca\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:36:54.935462 master-0 kubenswrapper[7508]: I0313 10:36:54.935417 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:36:54.935635 master-0 kubenswrapper[7508]: I0313 10:36:54.935609 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-catalog-content\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf" Mar 13 10:36:54.935717 master-0 kubenswrapper[7508]: I0313 10:36:54.935619 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-utilities\") pod \"redhat-operators-wlpwf\" 
(UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf" Mar 13 10:36:54.935842 master-0 kubenswrapper[7508]: I0313 10:36:54.935822 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:36:54.935970 master-0 kubenswrapper[7508]: I0313 10:36:54.935945 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-utilities\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf" Mar 13 10:36:54.936117 master-0 kubenswrapper[7508]: I0313 10:36:54.936068 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ed7aff-47c0-42f3-9a26-9385d2bde582-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:36:54.936240 master-0 kubenswrapper[7508]: I0313 10:36:54.936203 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm5z8\" (UniqueName: \"kubernetes.io/projected/742db892-aef7-428e-b0c0-b54c6c9bf48e-kube-api-access-gm5z8\") on node \"master-0\" DevicePath \"\"" Mar 13 10:36:54.936240 master-0 kubenswrapper[7508]: I0313 10:36:54.936220 7508 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-serving-ca\") on node \"master-0\" DevicePath \"\"" 
Mar 13 10:36:54.936240 master-0 kubenswrapper[7508]: I0313 10:36:54.936230 7508 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936240 master-0 kubenswrapper[7508]: I0313 10:36:54.936239 7508 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936247 7508 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936256 7508 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-image-import-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936264 7508 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-etcd-client\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936275 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/742db892-aef7-428e-b0c0-b54c6c9bf48e-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936285 7508 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936294 7508 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/742db892-aef7-428e-b0c0-b54c6c9bf48e-node-pullsecrets\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.936358 master-0 kubenswrapper[7508]: I0313 10:36:54.936303 7508 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/742db892-aef7-428e-b0c0-b54c6c9bf48e-encryption-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:54.956082 master-0 kubenswrapper[7508]: I0313 10:36:54.956031 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ps4tc\" (UniqueName: \"kubernetes.io/projected/20df9416-90f4-4c21-a3bc-c6e5f6622e15-kube-api-access-ps4tc\") pod \"redhat-operators-wlpwf\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:55.019057 master-0 kubenswrapper[7508]: I0313 10:36:55.018909 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_afb27f7f-4a24-404a-b9cd-206a1c33eb3c/installer/0.log"
Mar 13 10:36:55.019057 master-0 kubenswrapper[7508]: I0313 10:36:55.018989 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:55.041321 master-0 kubenswrapper[7508]: I0313 10:36:55.040758 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-var-lock\") pod \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") "
Mar 13 10:36:55.041321 master-0 kubenswrapper[7508]: I0313 10:36:55.040901 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kube-api-access\") pod \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") "
Mar 13 10:36:55.041321 master-0 kubenswrapper[7508]: I0313 10:36:55.040997 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kubelet-dir\") pod \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\" (UID: \"afb27f7f-4a24-404a-b9cd-206a1c33eb3c\") "
Mar 13 10:36:55.041321 master-0 kubenswrapper[7508]: I0313 10:36:55.041267 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5ed7aff-47c0-42f3-9a26-9385d2bde582-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.041321 master-0 kubenswrapper[7508]: I0313 10:36:55.041304 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5ed7aff-47c0-42f3-9a26-9385d2bde582-service-ca\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.041321 master-0 kubenswrapper[7508]: I0313 10:36:55.041334 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.041859 master-0 kubenswrapper[7508]: I0313 10:36:55.041379 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.041859 master-0 kubenswrapper[7508]: I0313 10:36:55.041410 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ed7aff-47c0-42f3-9a26-9385d2bde582-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.043599 master-0 kubenswrapper[7508]: I0313 10:36:55.043543 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5ed7aff-47c0-42f3-9a26-9385d2bde582-service-ca\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.043677 master-0 kubenswrapper[7508]: I0313 10:36:55.043627 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.043735 master-0 kubenswrapper[7508]: I0313 10:36:55.043685 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.044202 master-0 kubenswrapper[7508]: I0313 10:36:55.044152 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-var-lock" (OuterVolumeSpecName: "var-lock") pod "afb27f7f-4a24-404a-b9cd-206a1c33eb3c" (UID: "afb27f7f-4a24-404a-b9cd-206a1c33eb3c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:36:55.044311 master-0 kubenswrapper[7508]: I0313 10:36:55.044188 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "afb27f7f-4a24-404a-b9cd-206a1c33eb3c" (UID: "afb27f7f-4a24-404a-b9cd-206a1c33eb3c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:36:55.047979 master-0 kubenswrapper[7508]: I0313 10:36:55.047917 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ed7aff-47c0-42f3-9a26-9385d2bde582-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.048345 master-0 kubenswrapper[7508]: I0313 10:36:55.047956 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "afb27f7f-4a24-404a-b9cd-206a1c33eb3c" (UID: "afb27f7f-4a24-404a-b9cd-206a1c33eb3c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:36:55.050287 master-0 kubenswrapper[7508]: I0313 10:36:55.050230 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbmz8"
Mar 13 10:36:55.062054 master-0 kubenswrapper[7508]: I0313 10:36:55.062019 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5ed7aff-47c0-42f3-9a26-9385d2bde582-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.081259 master-0 kubenswrapper[7508]: I0313 10:36:55.081197 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlpwf"
Mar 13 10:36:55.145651 master-0 kubenswrapper[7508]: I0313 10:36:55.145615 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:55.145651 master-0 kubenswrapper[7508]: I0313 10:36:55.145648 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:55.146125 master-0 kubenswrapper[7508]: I0313 10:36:55.145659 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/afb27f7f-4a24-404a-b9cd-206a1c33eb3c-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 10:36:55.212397 master-0 kubenswrapper[7508]: I0313 10:36:55.212351 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:36:55.227785 master-0 kubenswrapper[7508]: W0313 10:36:55.227688 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5ed7aff_47c0_42f3_9a26_9385d2bde582.slice/crio-3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919 WatchSource:0}: Error finding container 3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919: Status 404 returned error can't find the container with id 3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919
Mar 13 10:36:55.470078 master-0 kubenswrapper[7508]: I0313 10:36:55.470020 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbmz8"]
Mar 13 10:36:55.481572 master-0 kubenswrapper[7508]: W0313 10:36:55.481508 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b072636_e46b_47f6_af85_3210e62bbd2d.slice/crio-7cc1254634b76ed578784e1acad8caa82c7b3cc646671ebb3835307434f88d23 WatchSource:0}: Error finding container 7cc1254634b76ed578784e1acad8caa82c7b3cc646671ebb3835307434f88d23: Status 404 returned error can't find the container with id 7cc1254634b76ed578784e1acad8caa82c7b3cc646671ebb3835307434f88d23
Mar 13 10:36:55.508551 master-0 kubenswrapper[7508]: I0313 10:36:55.507653 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b04498f0-5a3f-4461-aecb-50304662d854" path="/var/lib/kubelet/pods/b04498f0-5a3f-4461-aecb-50304662d854/volumes"
Mar 13 10:36:55.559361 master-0 kubenswrapper[7508]: I0313 10:36:55.559327 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_afb27f7f-4a24-404a-b9cd-206a1c33eb3c/installer/0.log"
Mar 13 10:36:55.559452 master-0 kubenswrapper[7508]: I0313 10:36:55.559387 7508 generic.go:334] "Generic (PLEG): container finished" podID="afb27f7f-4a24-404a-b9cd-206a1c33eb3c" containerID="0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e" exitCode=1
Mar 13 10:36:55.559522 master-0 kubenswrapper[7508]: I0313 10:36:55.559504 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 13 10:36:55.560024 master-0 kubenswrapper[7508]: I0313 10:36:55.559952 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"afb27f7f-4a24-404a-b9cd-206a1c33eb3c","Type":"ContainerDied","Data":"0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e"}
Mar 13 10:36:55.560136 master-0 kubenswrapper[7508]: I0313 10:36:55.560045 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"afb27f7f-4a24-404a-b9cd-206a1c33eb3c","Type":"ContainerDied","Data":"02bfbb38ca9efee460deebc093497c9b41495da7bacedfeea04543e2228d1f68"}
Mar 13 10:36:55.560177 master-0 kubenswrapper[7508]: I0313 10:36:55.560144 7508 scope.go:117] "RemoveContainer" containerID="0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e"
Mar 13 10:36:55.560806 master-0 kubenswrapper[7508]: I0313 10:36:55.560776 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-wlpwf"]
Mar 13 10:36:55.562359 master-0 kubenswrapper[7508]: I0313 10:36:55.562329 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" event={"ID":"b5ed7aff-47c0-42f3-9a26-9385d2bde582","Type":"ContainerStarted","Data":"01cb1eb4cd1847633cd84937df690f5742decd6c5d3c9b634653c1fb9ee3bc43"}
Mar 13 10:36:55.562439 master-0 kubenswrapper[7508]: I0313 10:36:55.562364 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" event={"ID":"b5ed7aff-47c0-42f3-9a26-9385d2bde582","Type":"ContainerStarted","Data":"3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919"}
Mar 13 10:36:55.563958 master-0 kubenswrapper[7508]: I0313 10:36:55.563878 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbmz8" event={"ID":"1b072636-e46b-47f6-af85-3210e62bbd2d","Type":"ContainerStarted","Data":"7cc1254634b76ed578784e1acad8caa82c7b3cc646671ebb3835307434f88d23"}
Mar 13 10:36:55.565404 master-0 kubenswrapper[7508]: I0313 10:36:55.565352 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-567956995b-dmf5x" event={"ID":"742db892-aef7-428e-b0c0-b54c6c9bf48e","Type":"ContainerDied","Data":"cbaf41dab9a2ee348b55ecc6df287959a27f53d4c1058f3224a3b3927f09019e"}
Mar 13 10:36:55.565478 master-0 kubenswrapper[7508]: I0313 10:36:55.565425 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-567956995b-dmf5x"
Mar 13 10:36:55.573910 master-0 kubenswrapper[7508]: W0313 10:36:55.573847 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20df9416_90f4_4c21_a3bc_c6e5f6622e15.slice/crio-6962fe374e095b27f0316a8835e65d00f53256a1d8e385e98ec2f5caea44bbb2 WatchSource:0}: Error finding container 6962fe374e095b27f0316a8835e65d00f53256a1d8e385e98ec2f5caea44bbb2: Status 404 returned error can't find the container with id 6962fe374e095b27f0316a8835e65d00f53256a1d8e385e98ec2f5caea44bbb2
Mar 13 10:36:55.589075 master-0 kubenswrapper[7508]: I0313 10:36:55.586627 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" podStartSLOduration=1.586607222 podStartE2EDuration="1.586607222s" podCreationTimestamp="2026-03-13 10:36:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:36:55.585900713 +0000 UTC m=+54.328725830" watchObservedRunningTime="2026-03-13 10:36:55.586607222 +0000 UTC m=+54.329432339"
Mar 13 10:36:55.591348 master-0 kubenswrapper[7508]: I0313 10:36:55.591216 7508 scope.go:117] "RemoveContainer" containerID="0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e"
Mar 13 10:36:55.592080 master-0 kubenswrapper[7508]: E0313 10:36:55.591981 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e\": container with ID starting with 0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e not found: ID does not exist" containerID="0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e"
Mar 13 10:36:55.592080 master-0 kubenswrapper[7508]: I0313 10:36:55.592011 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e"} err="failed to get container status \"0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e\": rpc error: code = NotFound desc = could not find container \"0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e\": container with ID starting with 0dd857b51303d9ca30c93bd2313eea0afd128b9b80aac34dcc1325581eaf6c4e not found: ID does not exist"
Mar 13 10:36:55.592080 master-0 kubenswrapper[7508]: I0313 10:36:55.592031 7508 scope.go:117] "RemoveContainer" containerID="f3bca609c73534ebcfd104c88e8548957e5b2088adf9149ab9ef045bc0dd24ff"
Mar 13 10:36:55.612729 master-0 kubenswrapper[7508]: I0313 10:36:55.612649 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 13 10:36:55.613135 master-0 kubenswrapper[7508]: E0313 10:36:55.612885 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afb27f7f-4a24-404a-b9cd-206a1c33eb3c" containerName="installer"
Mar 13 10:36:55.613418 master-0 kubenswrapper[7508]: I0313 10:36:55.613087 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="afb27f7f-4a24-404a-b9cd-206a1c33eb3c" containerName="installer"
Mar 13 10:36:55.613613 master-0 kubenswrapper[7508]: I0313 10:36:55.613529 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="afb27f7f-4a24-404a-b9cd-206a1c33eb3c" containerName="installer"
Mar 13 10:36:55.614162 master-0 kubenswrapper[7508]: I0313 10:36:55.614134 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.624880 master-0 kubenswrapper[7508]: I0313 10:36:55.624787 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 13 10:36:55.625273 master-0 kubenswrapper[7508]: I0313 10:36:55.625256 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 13 10:36:55.647740 master-0 kubenswrapper[7508]: I0313 10:36:55.647687 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 13 10:36:55.657528 master-0 kubenswrapper[7508]: I0313 10:36:55.657486 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kube-api-access\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.657806 master-0 kubenswrapper[7508]: I0313 10:36:55.657579 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-var-lock\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.657806 master-0 kubenswrapper[7508]: I0313 10:36:55.657641 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.699004 master-0 kubenswrapper[7508]: I0313 10:36:55.698927 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-567956995b-dmf5x"]
Mar 13 10:36:55.712049 master-0 kubenswrapper[7508]: I0313 10:36:55.711991 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-567956995b-dmf5x"]
Mar 13 10:36:55.764774 master-0 kubenswrapper[7508]: I0313 10:36:55.764712 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kube-api-access\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.765012 master-0 kubenswrapper[7508]: I0313 10:36:55.764791 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-var-lock\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.765012 master-0 kubenswrapper[7508]: I0313 10:36:55.764821 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.765012 master-0 kubenswrapper[7508]: I0313 10:36:55.764910 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.765799 master-0 kubenswrapper[7508]: I0313 10:36:55.765312 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-var-lock\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.793710 master-0 kubenswrapper[7508]: I0313 10:36:55.793645 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kube-api-access\") pod \"installer-3-master-0\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:55.958753 master-0 kubenswrapper[7508]: I0313 10:36:55.958681 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 13 10:36:56.575578 master-0 kubenswrapper[7508]: I0313 10:36:56.575507 7508 generic.go:334] "Generic (PLEG): container finished" podID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerID="3defd4dddffc43465173ec71d65059b52bceb629972998b93ecfd713e6cbe46d" exitCode=0
Mar 13 10:36:56.576111 master-0 kubenswrapper[7508]: I0313 10:36:56.575610 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbmz8" event={"ID":"1b072636-e46b-47f6-af85-3210e62bbd2d","Type":"ContainerDied","Data":"3defd4dddffc43465173ec71d65059b52bceb629972998b93ecfd713e6cbe46d"}
Mar 13 10:36:56.580230 master-0 kubenswrapper[7508]: I0313 10:36:56.580186 7508 generic.go:334] "Generic (PLEG): container finished" podID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerID="191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95" exitCode=0
Mar 13 10:36:56.580302 master-0 kubenswrapper[7508]: I0313 10:36:56.580277 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlpwf" event={"ID":"20df9416-90f4-4c21-a3bc-c6e5f6622e15","Type":"ContainerDied","Data":"191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95"}
Mar 13 10:36:56.580347 master-0 kubenswrapper[7508]: I0313 10:36:56.580307 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlpwf" event={"ID":"20df9416-90f4-4c21-a3bc-c6e5f6622e15","Type":"ContainerStarted","Data":"6962fe374e095b27f0316a8835e65d00f53256a1d8e385e98ec2f5caea44bbb2"}
Mar 13 10:36:56.648338 master-0 kubenswrapper[7508]: I0313 10:36:56.647927 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-29dk6"]
Mar 13 10:36:56.654890 master-0 kubenswrapper[7508]: I0313 10:36:56.654832 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29dk6"
Mar 13 10:36:56.712053 master-0 kubenswrapper[7508]: I0313 10:36:56.711251 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 13 10:36:56.773919 master-0 kubenswrapper[7508]: I0313 10:36:56.773820 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msv2k\" (UniqueName: \"kubernetes.io/projected/61b83fd7-2b78-42a9-9d93-0be3fd59a679-kube-api-access-msv2k\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6"
Mar 13 10:36:56.773919 master-0 kubenswrapper[7508]: I0313 10:36:56.773886 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-utilities\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6"
Mar 13 10:36:56.774252 master-0 kubenswrapper[7508]: I0313 10:36:56.774053 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-catalog-content\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6"
Mar 13 10:36:56.874891 master-0 kubenswrapper[7508]: I0313 10:36:56.874826 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msv2k\" (UniqueName: \"kubernetes.io/projected/61b83fd7-2b78-42a9-9d93-0be3fd59a679-kube-api-access-msv2k\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6"
Mar 13 10:36:56.874891 master-0
kubenswrapper[7508]: I0313 10:36:56.874888 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-utilities\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:36:56.875237 master-0 kubenswrapper[7508]: I0313 10:36:56.874923 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-catalog-content\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:36:56.875452 master-0 kubenswrapper[7508]: I0313 10:36:56.875412 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-catalog-content\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:36:56.876038 master-0 kubenswrapper[7508]: I0313 10:36:56.875998 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-utilities\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:36:57.515391 master-0 kubenswrapper[7508]: I0313 10:36:57.190523 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29dk6"] Mar 13 10:36:57.531210 master-0 kubenswrapper[7508]: I0313 10:36:57.530327 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742db892-aef7-428e-b0c0-b54c6c9bf48e" 
path="/var/lib/kubelet/pods/742db892-aef7-428e-b0c0-b54c6c9bf48e/volumes" Mar 13 10:36:57.531210 master-0 kubenswrapper[7508]: I0313 10:36:57.531129 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afb27f7f-4a24-404a-b9cd-206a1c33eb3c" path="/var/lib/kubelet/pods/afb27f7f-4a24-404a-b9cd-206a1c33eb3c/volumes" Mar 13 10:36:59.020610 master-0 kubenswrapper[7508]: I0313 10:36:59.020526 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msv2k\" (UniqueName: \"kubernetes.io/projected/61b83fd7-2b78-42a9-9d93-0be3fd59a679-kube-api-access-msv2k\") pod \"certified-operators-29dk6\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:36:59.076386 master-0 kubenswrapper[7508]: I0313 10:36:59.076304 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:36:59.189245 master-0 kubenswrapper[7508]: W0313 10:36:59.189199 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5b8b68d8_a452_45f4_aaad_3d91cfb3e298.slice/crio-80b93464f895f412d46c60dec5c28dc3ca977c744f86bc0ec48b9c7d41093db2 WatchSource:0}: Error finding container 80b93464f895f412d46c60dec5c28dc3ca977c744f86bc0ec48b9c7d41093db2: Status 404 returned error can't find the container with id 80b93464f895f412d46c60dec5c28dc3ca977c744f86bc0ec48b9c7d41093db2 Mar 13 10:36:59.603815 master-0 kubenswrapper[7508]: I0313 10:36:59.603660 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"5b8b68d8-a452-45f4-aaad-3d91cfb3e298","Type":"ContainerStarted","Data":"80b93464f895f412d46c60dec5c28dc3ca977c744f86bc0ec48b9c7d41093db2"} Mar 13 10:36:59.701167 master-0 kubenswrapper[7508]: I0313 10:36:59.699251 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-576d4447f8-zqphk"] Mar 13 
10:36:59.701494 master-0 kubenswrapper[7508]: I0313 10:36:59.701402 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.744669 master-0 kubenswrapper[7508]: I0313 10:36:59.744610 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 10:36:59.745238 master-0 kubenswrapper[7508]: I0313 10:36:59.745206 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 10:36:59.745583 master-0 kubenswrapper[7508]: I0313 10:36:59.745535 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 10:36:59.745736 master-0 kubenswrapper[7508]: I0313 10:36:59.745696 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 10:36:59.745899 master-0 kubenswrapper[7508]: I0313 10:36:59.745857 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 10:36:59.746013 master-0 kubenswrapper[7508]: I0313 10:36:59.745990 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 10:36:59.746215 master-0 kubenswrapper[7508]: I0313 10:36:59.746140 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 10:36:59.746503 master-0 kubenswrapper[7508]: I0313 10:36:59.746457 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 10:36:59.746873 master-0 kubenswrapper[7508]: I0313 10:36:59.746847 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 10:36:59.756809 master-0 kubenswrapper[7508]: I0313 10:36:59.756744 7508 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 10:36:59.764121 master-0 kubenswrapper[7508]: I0313 10:36:59.764027 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-etcd-serving-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764121 master-0 kubenswrapper[7508]: I0313 10:36:59.764125 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764431 master-0 kubenswrapper[7508]: I0313 10:36:59.764175 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-serving-cert\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764431 master-0 kubenswrapper[7508]: I0313 10:36:59.764210 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-node-pullsecrets\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764431 master-0 kubenswrapper[7508]: I0313 10:36:59.764349 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-audit-dir\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764431 master-0 kubenswrapper[7508]: I0313 10:36:59.764421 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-etcd-client\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764655 master-0 kubenswrapper[7508]: I0313 10:36:59.764504 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764655 master-0 kubenswrapper[7508]: I0313 10:36:59.764535 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6lnq\" (UniqueName: \"kubernetes.io/projected/018c9219-d314-4408-ac39-93475d87eefb-kube-api-access-v6lnq\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764655 master-0 kubenswrapper[7508]: I0313 10:36:59.764565 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764824 master-0 kubenswrapper[7508]: I0313 10:36:59.764646 7508 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-image-import-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.764824 master-0 kubenswrapper[7508]: I0313 10:36:59.764687 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-audit\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865450 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865505 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6lnq\" (UniqueName: \"kubernetes.io/projected/018c9219-d314-4408-ac39-93475d87eefb-kube-api-access-v6lnq\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865526 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 
10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865560 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-image-import-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865577 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-audit\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865594 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-etcd-serving-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865611 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.865619 master-0 kubenswrapper[7508]: I0313 10:36:59.865631 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-node-pullsecrets\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " 
pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.866082 master-0 kubenswrapper[7508]: I0313 10:36:59.865647 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-serving-cert\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.866082 master-0 kubenswrapper[7508]: I0313 10:36:59.865663 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-audit-dir\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.866082 master-0 kubenswrapper[7508]: I0313 10:36:59.865680 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-etcd-client\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.866580 master-0 kubenswrapper[7508]: I0313 10:36:59.866516 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-node-pullsecrets\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.866815 master-0 kubenswrapper[7508]: I0313 10:36:59.866746 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-audit-dir\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " 
pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.867246 master-0 kubenswrapper[7508]: I0313 10:36:59.867218 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-audit\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.867288 master-0 kubenswrapper[7508]: I0313 10:36:59.867214 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.867461 master-0 kubenswrapper[7508]: I0313 10:36:59.867426 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-image-import-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.867935 master-0 kubenswrapper[7508]: I0313 10:36:59.867904 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-etcd-serving-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.868616 master-0 kubenswrapper[7508]: I0313 10:36:59.868587 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-etcd-client\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " 
pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.868867 master-0 kubenswrapper[7508]: I0313 10:36:59.868824 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.869118 master-0 kubenswrapper[7508]: I0313 10:36:59.869067 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-serving-cert\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:36:59.869761 master-0 kubenswrapper[7508]: I0313 10:36:59.869729 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:37:00.611136 master-0 kubenswrapper[7508]: I0313 10:37:00.611055 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"5b8b68d8-a452-45f4-aaad-3d91cfb3e298","Type":"ContainerStarted","Data":"9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed"} Mar 13 10:37:00.805675 master-0 kubenswrapper[7508]: I0313 10:37:00.805621 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6lnq\" (UniqueName: \"kubernetes.io/projected/018c9219-d314-4408-ac39-93475d87eefb-kube-api-access-v6lnq\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" 
Mar 13 10:37:00.813289 master-0 kubenswrapper[7508]: I0313 10:37:00.812736 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 10:37:00.819713 master-0 kubenswrapper[7508]: I0313 10:37:00.819642 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-576d4447f8-zqphk"] Mar 13 10:37:00.820033 master-0 kubenswrapper[7508]: I0313 10:37:00.819999 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.823485 master-0 kubenswrapper[7508]: I0313 10:37:00.823408 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 10:37:00.897486 master-0 kubenswrapper[7508]: I0313 10:37:00.897419 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-var-lock\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.897486 master-0 kubenswrapper[7508]: I0313 10:37:00.897494 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.897809 master-0 kubenswrapper[7508]: I0313 10:37:00.897547 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/046ee36d-4062-4c48-bab0-57381613b2ad-kube-api-access\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" 
Mar 13 10:37:00.937914 master-0 kubenswrapper[7508]: I0313 10:37:00.937835 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:37:00.999472 master-0 kubenswrapper[7508]: I0313 10:37:00.999373 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/046ee36d-4062-4c48-bab0-57381613b2ad-kube-api-access\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.999755 master-0 kubenswrapper[7508]: I0313 10:37:00.999501 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-var-lock\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.999755 master-0 kubenswrapper[7508]: I0313 10:37:00.999557 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.999755 master-0 kubenswrapper[7508]: I0313 10:37:00.999679 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:00.999755 master-0 kubenswrapper[7508]: I0313 10:37:00.999745 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-var-lock\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:01.442527 master-0 kubenswrapper[7508]: I0313 10:37:01.442448 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-29dk6"] Mar 13 10:37:01.446949 master-0 kubenswrapper[7508]: I0313 10:37:01.446868 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 10:37:01.717772 master-0 kubenswrapper[7508]: W0313 10:37:01.717157 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b83fd7_2b78_42a9_9d93_0be3fd59a679.slice/crio-29b996efefea62a5520da7a1bea5d4af7c6b2e7ab4bada22cac5fe5c2c1aa4be WatchSource:0}: Error finding container 29b996efefea62a5520da7a1bea5d4af7c6b2e7ab4bada22cac5fe5c2c1aa4be: Status 404 returned error can't find the container with id 29b996efefea62a5520da7a1bea5d4af7c6b2e7ab4bada22cac5fe5c2c1aa4be Mar 13 10:37:02.316984 master-0 kubenswrapper[7508]: I0313 10:37:02.316855 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-dbhll"] Mar 13 10:37:02.319290 master-0 kubenswrapper[7508]: I0313 10:37:02.318941 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.411947 master-0 kubenswrapper[7508]: E0313 10:37:02.408656 7508 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61b83fd7_2b78_42a9_9d93_0be3fd59a679.slice/crio-e006633100d465a175ba215dc8fcac40a2f8affb6e3cca4831dc0965bfd5291f.scope\": RecentStats: unable to find data in memory cache]" Mar 13 10:37:02.416917 master-0 kubenswrapper[7508]: I0313 10:37:02.416853 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hmx8s\" (UniqueName: \"kubernetes.io/projected/1b57fa2d-b65e-4c69-97ce-4a379470d2de-kube-api-access-hmx8s\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.416917 master-0 kubenswrapper[7508]: I0313 10:37:02.416918 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-utilities\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.417204 master-0 kubenswrapper[7508]: I0313 10:37:02.417124 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-catalog-content\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.518243 master-0 kubenswrapper[7508]: I0313 10:37:02.518074 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hmx8s\" (UniqueName: 
\"kubernetes.io/projected/1b57fa2d-b65e-4c69-97ce-4a379470d2de-kube-api-access-hmx8s\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.518776 master-0 kubenswrapper[7508]: I0313 10:37:02.518651 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-utilities\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.519017 master-0 kubenswrapper[7508]: I0313 10:37:02.518981 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-catalog-content\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.520308 master-0 kubenswrapper[7508]: I0313 10:37:02.519675 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-utilities\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.520308 master-0 kubenswrapper[7508]: I0313 10:37:02.520024 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-catalog-content\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:02.623895 master-0 kubenswrapper[7508]: I0313 10:37:02.623844 7508 generic.go:334] "Generic (PLEG): container finished" podID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" 
containerID="e006633100d465a175ba215dc8fcac40a2f8affb6e3cca4831dc0965bfd5291f" exitCode=0 Mar 13 10:37:02.624144 master-0 kubenswrapper[7508]: I0313 10:37:02.623914 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29dk6" event={"ID":"61b83fd7-2b78-42a9-9d93-0be3fd59a679","Type":"ContainerDied","Data":"e006633100d465a175ba215dc8fcac40a2f8affb6e3cca4831dc0965bfd5291f"} Mar 13 10:37:02.624144 master-0 kubenswrapper[7508]: I0313 10:37:02.623958 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29dk6" event={"ID":"61b83fd7-2b78-42a9-9d93-0be3fd59a679","Type":"ContainerStarted","Data":"29b996efefea62a5520da7a1bea5d4af7c6b2e7ab4bada22cac5fe5c2c1aa4be"} Mar 13 10:37:03.106205 master-0 kubenswrapper[7508]: I0313 10:37:03.105538 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qt95m" Mar 13 10:37:03.327572 master-0 kubenswrapper[7508]: I0313 10:37:03.325812 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbhll"] Mar 13 10:37:04.324614 master-0 kubenswrapper[7508]: I0313 10:37:04.324534 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-576d4447f8-zqphk"] Mar 13 10:37:04.341778 master-0 kubenswrapper[7508]: I0313 10:37:04.341741 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 10:37:04.349353 master-0 kubenswrapper[7508]: I0313 10:37:04.349303 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/046ee36d-4062-4c48-bab0-57381613b2ad-kube-api-access\") pod \"installer-1-master-0\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:04.421948 master-0 kubenswrapper[7508]: I0313 10:37:04.421900 7508 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hmx8s\" (UniqueName: \"kubernetes.io/projected/1b57fa2d-b65e-4c69-97ce-4a379470d2de-kube-api-access-hmx8s\") pod \"community-operators-dbhll\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:04.450919 master-0 kubenswrapper[7508]: I0313 10:37:04.450841 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:04.451431 master-0 kubenswrapper[7508]: I0313 10:37:04.451392 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:04.509769 master-0 kubenswrapper[7508]: I0313 10:37:04.509674 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 10:37:04.510025 master-0 kubenswrapper[7508]: I0313 10:37:04.509955 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="5b8b68d8-a452-45f4-aaad-3d91cfb3e298" containerName="installer" containerID="cri-o://9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed" gracePeriod=30 Mar 13 10:37:04.629083 master-0 kubenswrapper[7508]: I0313 10:37:04.626534 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=9.626498685 podStartE2EDuration="9.626498685s" podCreationTimestamp="2026-03-13 10:36:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:04.625362496 +0000 UTC m=+63.368187613" watchObservedRunningTime="2026-03-13 10:37:04.626498685 +0000 UTC m=+63.369323802" Mar 13 10:37:04.641277 master-0 kubenswrapper[7508]: I0313 10:37:04.641226 7508 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerStarted","Data":"6d81df6e0c2c501a006e6d355e7ca64b7f375686077a624175b4786dbf2e5138"} Mar 13 10:37:04.721673 master-0 kubenswrapper[7508]: I0313 10:37:04.721539 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5997c88d95-94gwc"] Mar 13 10:37:04.782738 master-0 kubenswrapper[7508]: I0313 10:37:04.782665 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"] Mar 13 10:37:04.782988 master-0 kubenswrapper[7508]: I0313 10:37:04.782963 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" podUID="7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" containerName="route-controller-manager" containerID="cri-o://416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050" gracePeriod=30 Mar 13 10:37:05.094962 master-0 kubenswrapper[7508]: I0313 10:37:05.094907 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_5b8b68d8-a452-45f4-aaad-3d91cfb3e298/installer/0.log" Mar 13 10:37:05.095281 master-0 kubenswrapper[7508]: I0313 10:37:05.095205 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 10:37:05.168172 master-0 kubenswrapper[7508]: I0313 10:37:05.168044 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kubelet-dir\") pod \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " Mar 13 10:37:05.168172 master-0 kubenswrapper[7508]: I0313 10:37:05.168135 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-var-lock\") pod \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " Mar 13 10:37:05.168172 master-0 kubenswrapper[7508]: I0313 10:37:05.168186 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kube-api-access\") pod \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\" (UID: \"5b8b68d8-a452-45f4-aaad-3d91cfb3e298\") " Mar 13 10:37:05.168532 master-0 kubenswrapper[7508]: I0313 10:37:05.168205 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5b8b68d8-a452-45f4-aaad-3d91cfb3e298" (UID: "5b8b68d8-a452-45f4-aaad-3d91cfb3e298"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:05.168532 master-0 kubenswrapper[7508]: I0313 10:37:05.168277 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-var-lock" (OuterVolumeSpecName: "var-lock") pod "5b8b68d8-a452-45f4-aaad-3d91cfb3e298" (UID: "5b8b68d8-a452-45f4-aaad-3d91cfb3e298"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:05.168532 master-0 kubenswrapper[7508]: I0313 10:37:05.168407 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.168532 master-0 kubenswrapper[7508]: I0313 10:37:05.168421 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.172994 master-0 kubenswrapper[7508]: I0313 10:37:05.172498 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5b8b68d8-a452-45f4-aaad-3d91cfb3e298" (UID: "5b8b68d8-a452-45f4-aaad-3d91cfb3e298"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:05.236632 master-0 kubenswrapper[7508]: I0313 10:37:05.236530 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 10:37:05.242470 master-0 kubenswrapper[7508]: I0313 10:37:05.242248 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-dbhll"] Mar 13 10:37:05.283439 master-0 kubenswrapper[7508]: I0313 10:37:05.270554 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b8b68d8-a452-45f4-aaad-3d91cfb3e298-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.408310 master-0 kubenswrapper[7508]: I0313 10:37:05.408208 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" Mar 13 10:37:05.473277 master-0 kubenswrapper[7508]: I0313 10:37:05.473225 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-serving-cert\") pod \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " Mar 13 10:37:05.473565 master-0 kubenswrapper[7508]: I0313 10:37:05.473516 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-client-ca\") pod \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " Mar 13 10:37:05.473995 master-0 kubenswrapper[7508]: I0313 10:37:05.473974 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjbpt\" (UniqueName: \"kubernetes.io/projected/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-kube-api-access-hjbpt\") pod \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " Mar 13 10:37:05.474278 master-0 kubenswrapper[7508]: I0313 10:37:05.474200 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-config\") pod \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\" (UID: \"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c\") " Mar 13 10:37:05.474713 master-0 kubenswrapper[7508]: I0313 10:37:05.474285 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-client-ca" (OuterVolumeSpecName: "client-ca") pod "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" (UID: "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:37:05.475234 master-0 kubenswrapper[7508]: I0313 10:37:05.475210 7508 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.476776 master-0 kubenswrapper[7508]: I0313 10:37:05.476727 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" (UID: "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:37:05.477747 master-0 kubenswrapper[7508]: I0313 10:37:05.477709 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-config" (OuterVolumeSpecName: "config") pod "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" (UID: "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:37:05.480868 master-0 kubenswrapper[7508]: I0313 10:37:05.480822 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-kube-api-access-hjbpt" (OuterVolumeSpecName: "kube-api-access-hjbpt") pod "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" (UID: "7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c"). InnerVolumeSpecName "kube-api-access-hjbpt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:05.576870 master-0 kubenswrapper[7508]: I0313 10:37:05.576813 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjbpt\" (UniqueName: \"kubernetes.io/projected/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-kube-api-access-hjbpt\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.576870 master-0 kubenswrapper[7508]: I0313 10:37:05.576865 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.576870 master-0 kubenswrapper[7508]: I0313 10:37:05.576881 7508 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:05.649899 master-0 kubenswrapper[7508]: I0313 10:37:05.649849 7508 generic.go:334] "Generic (PLEG): container finished" podID="018c9219-d314-4408-ac39-93475d87eefb" containerID="74ae020ca7669fb01b80f8f98f454493cc6cfee0df109ea9dc9a0bb83ef979da" exitCode=0 Mar 13 10:37:05.650994 master-0 kubenswrapper[7508]: I0313 10:37:05.649966 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerDied","Data":"74ae020ca7669fb01b80f8f98f454493cc6cfee0df109ea9dc9a0bb83ef979da"} Mar 13 10:37:05.655743 master-0 kubenswrapper[7508]: I0313 10:37:05.655701 7508 generic.go:334] "Generic (PLEG): container finished" podID="7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" containerID="416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050" exitCode=0 Mar 13 10:37:05.655851 master-0 kubenswrapper[7508]: I0313 10:37:05.655767 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" 
event={"ID":"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c","Type":"ContainerDied","Data":"416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050"} Mar 13 10:37:05.655851 master-0 kubenswrapper[7508]: I0313 10:37:05.655793 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" event={"ID":"7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c","Type":"ContainerDied","Data":"e10b8dc5da77f02b966d2a4bc5f393d98d92a2cd51a02f0fa6d43be469f40a62"} Mar 13 10:37:05.655851 master-0 kubenswrapper[7508]: I0313 10:37:05.655811 7508 scope.go:117] "RemoveContainer" containerID="416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050" Mar 13 10:37:05.655963 master-0 kubenswrapper[7508]: I0313 10:37:05.655905 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk" Mar 13 10:37:05.666756 master-0 kubenswrapper[7508]: I0313 10:37:05.666721 7508 generic.go:334] "Generic (PLEG): container finished" podID="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" containerID="4bdeab3ebfebb7845458ea9c29cbf7443ef96922911395dc3575274a6c5d9316" exitCode=0 Mar 13 10:37:05.666846 master-0 kubenswrapper[7508]: I0313 10:37:05.666781 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" event={"ID":"a9258b0f-fdcc-4bfa-b982-5cf3c899c432","Type":"ContainerDied","Data":"4bdeab3ebfebb7845458ea9c29cbf7443ef96922911395dc3575274a6c5d9316"} Mar 13 10:37:05.693015 master-0 kubenswrapper[7508]: I0313 10:37:05.691102 7508 generic.go:334] "Generic (PLEG): container finished" podID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerID="f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f" exitCode=0 Mar 13 10:37:05.693015 master-0 kubenswrapper[7508]: I0313 10:37:05.691508 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhll" 
event={"ID":"1b57fa2d-b65e-4c69-97ce-4a379470d2de","Type":"ContainerDied","Data":"f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f"} Mar 13 10:37:05.693015 master-0 kubenswrapper[7508]: I0313 10:37:05.691638 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhll" event={"ID":"1b57fa2d-b65e-4c69-97ce-4a379470d2de","Type":"ContainerStarted","Data":"11c2c30e0a3283728cc95581ddad85c479ad7dd17380bf08f4e4ddbd96d47244"} Mar 13 10:37:05.698587 master-0 kubenswrapper[7508]: I0313 10:37:05.698390 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" event={"ID":"a855e43e-f243-4397-a92f-60285f679eee","Type":"ContainerStarted","Data":"0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211"} Mar 13 10:37:05.698587 master-0 kubenswrapper[7508]: I0313 10:37:05.698564 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" podUID="a855e43e-f243-4397-a92f-60285f679eee" containerName="controller-manager" containerID="cri-o://0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211" gracePeriod=30 Mar 13 10:37:05.701942 master-0 kubenswrapper[7508]: I0313 10:37:05.701719 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:37:05.703021 master-0 kubenswrapper[7508]: I0313 10:37:05.702987 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"046ee36d-4062-4c48-bab0-57381613b2ad","Type":"ContainerStarted","Data":"d95bfc7a6e7ed39f46a268281c89d6bfb2a11e840e00f0153aa1d414147f5319"} Mar 13 10:37:05.704196 master-0 kubenswrapper[7508]: I0313 10:37:05.704172 7508 scope.go:117] "RemoveContainer" containerID="416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050" Mar 13 
10:37:05.704565 master-0 kubenswrapper[7508]: E0313 10:37:05.704526 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050\": container with ID starting with 416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050 not found: ID does not exist" containerID="416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050" Mar 13 10:37:05.704643 master-0 kubenswrapper[7508]: I0313 10:37:05.704577 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050"} err="failed to get container status \"416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050\": rpc error: code = NotFound desc = could not find container \"416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050\": container with ID starting with 416a527cc68dc0fb3e14f50e73aee93b02d1228216015c3a6ee3410aded60050 not found: ID does not exist" Mar 13 10:37:05.706119 master-0 kubenswrapper[7508]: I0313 10:37:05.706009 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:37:05.706799 master-0 kubenswrapper[7508]: I0313 10:37:05.706722 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_5b8b68d8-a452-45f4-aaad-3d91cfb3e298/installer/0.log" Mar 13 10:37:05.706799 master-0 kubenswrapper[7508]: I0313 10:37:05.706760 7508 generic.go:334] "Generic (PLEG): container finished" podID="5b8b68d8-a452-45f4-aaad-3d91cfb3e298" containerID="9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed" exitCode=1 Mar 13 10:37:05.706799 master-0 kubenswrapper[7508]: I0313 10:37:05.706788 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" 
event={"ID":"5b8b68d8-a452-45f4-aaad-3d91cfb3e298","Type":"ContainerDied","Data":"9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed"} Mar 13 10:37:05.706963 master-0 kubenswrapper[7508]: I0313 10:37:05.706811 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"5b8b68d8-a452-45f4-aaad-3d91cfb3e298","Type":"ContainerDied","Data":"80b93464f895f412d46c60dec5c28dc3ca977c744f86bc0ec48b9c7d41093db2"} Mar 13 10:37:05.706963 master-0 kubenswrapper[7508]: I0313 10:37:05.706887 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 13 10:37:05.707408 master-0 kubenswrapper[7508]: I0313 10:37:05.707388 7508 scope.go:117] "RemoveContainer" containerID="9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed" Mar 13 10:37:05.719016 master-0 kubenswrapper[7508]: I0313 10:37:05.718777 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"] Mar 13 10:37:05.733711 master-0 kubenswrapper[7508]: I0313 10:37:05.733658 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c67dff5b-jmqzk"] Mar 13 10:37:05.744208 master-0 kubenswrapper[7508]: I0313 10:37:05.744147 7508 scope.go:117] "RemoveContainer" containerID="9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed" Mar 13 10:37:05.746760 master-0 kubenswrapper[7508]: E0313 10:37:05.746706 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed\": container with ID starting with 9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed not found: ID does not exist" containerID="9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed" Mar 13 10:37:05.746851 
master-0 kubenswrapper[7508]: I0313 10:37:05.746761 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed"} err="failed to get container status \"9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed\": rpc error: code = NotFound desc = could not find container \"9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed\": container with ID starting with 9e43015549be83a71fed76ecb68efa0cca510990c7a000e4903da98c1ff8b7ed not found: ID does not exist" Mar 13 10:37:05.757294 master-0 kubenswrapper[7508]: I0313 10:37:05.757007 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 10:37:05.763391 master-0 kubenswrapper[7508]: I0313 10:37:05.763286 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 13 10:37:05.766476 master-0 kubenswrapper[7508]: I0313 10:37:05.766390 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 10:37:05.768939 master-0 kubenswrapper[7508]: I0313 10:37:05.766731 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="d04e4749-2b79-49e2-a451-a2733443a913" containerName="installer" containerID="cri-o://6bd307155c0397e849a532ef6dcebc4cbbbf850ed4d002b219c4c046ec36c6b8" gracePeriod=30 Mar 13 10:37:05.781344 master-0 kubenswrapper[7508]: I0313 10:37:05.779986 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" podStartSLOduration=17.384873708 podStartE2EDuration="28.779959418s" podCreationTimestamp="2026-03-13 10:36:37 +0000 UTC" firstStartedPulling="2026-03-13 10:36:53.243643804 +0000 UTC m=+51.986468921" lastFinishedPulling="2026-03-13 10:37:04.638729504 
+0000 UTC m=+63.381554631" observedRunningTime="2026-03-13 10:37:05.777543436 +0000 UTC m=+64.520368553" watchObservedRunningTime="2026-03-13 10:37:05.779959418 +0000 UTC m=+64.522784535" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: I0313 10:37:05.851515 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"] Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: E0313 10:37:05.851702 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b8b68d8-a452-45f4-aaad-3d91cfb3e298" containerName="installer" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: I0313 10:37:05.851713 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b8b68d8-a452-45f4-aaad-3d91cfb3e298" containerName="installer" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: E0313 10:37:05.851728 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" containerName="route-controller-manager" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: I0313 10:37:05.851735 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" containerName="route-controller-manager" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: I0313 10:37:05.851827 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" containerName="route-controller-manager" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: I0313 10:37:05.851839 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b8b68d8-a452-45f4-aaad-3d91cfb3e298" containerName="installer" Mar 13 10:37:05.854200 master-0 kubenswrapper[7508]: I0313 10:37:05.852211 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.865843 master-0 kubenswrapper[7508]: I0313 10:37:05.855452 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 10:37:05.865843 master-0 kubenswrapper[7508]: I0313 10:37:05.855579 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 10:37:05.865843 master-0 kubenswrapper[7508]: I0313 10:37:05.855671 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 10:37:05.865843 master-0 kubenswrapper[7508]: I0313 10:37:05.855878 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 10:37:05.865843 master-0 kubenswrapper[7508]: I0313 10:37:05.855906 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 10:37:05.881292 master-0 kubenswrapper[7508]: I0313 10:37:05.881087 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"] Mar 13 10:37:05.885294 master-0 kubenswrapper[7508]: I0313 10:37:05.883237 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.885294 master-0 kubenswrapper[7508]: I0313 10:37:05.883324 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.885294 master-0 kubenswrapper[7508]: I0313 10:37:05.883360 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.885294 master-0 kubenswrapper[7508]: I0313 10:37:05.883402 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.986145 master-0 kubenswrapper[7508]: I0313 10:37:05.985037 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.986145 master-0 kubenswrapper[7508]: I0313 10:37:05.985165 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: 
\"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.986145 master-0 kubenswrapper[7508]: I0313 10:37:05.985201 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.986145 master-0 kubenswrapper[7508]: I0313 10:37:05.985243 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.986479 master-0 kubenswrapper[7508]: I0313 10:37:05.986288 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.987710 master-0 kubenswrapper[7508]: I0313 10:37:05.987607 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:05.990637 master-0 kubenswrapper[7508]: I0313 10:37:05.990542 7508 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:06.002470 master-0 kubenswrapper[7508]: I0313 10:37:06.002402 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:06.183688 master-0 kubenswrapper[7508]: I0313 10:37:06.183648 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:37:06.186916 master-0 kubenswrapper[7508]: I0313 10:37:06.186876 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-proxy-ca-bundles\") pod \"a855e43e-f243-4397-a92f-60285f679eee\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " Mar 13 10:37:06.186987 master-0 kubenswrapper[7508]: I0313 10:37:06.186960 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:06.187403 master-0 kubenswrapper[7508]: I0313 10:37:06.186965 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a855e43e-f243-4397-a92f-60285f679eee-serving-cert\") pod \"a855e43e-f243-4397-a92f-60285f679eee\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " Mar 13 10:37:06.187467 master-0 kubenswrapper[7508]: I0313 10:37:06.187450 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-client-ca\") pod \"a855e43e-f243-4397-a92f-60285f679eee\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " Mar 13 10:37:06.187503 master-0 kubenswrapper[7508]: I0313 10:37:06.187481 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9szlb\" (UniqueName: \"kubernetes.io/projected/a855e43e-f243-4397-a92f-60285f679eee-kube-api-access-9szlb\") pod \"a855e43e-f243-4397-a92f-60285f679eee\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " Mar 13 10:37:06.187559 master-0 kubenswrapper[7508]: I0313 10:37:06.187536 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-config\") pod \"a855e43e-f243-4397-a92f-60285f679eee\" (UID: \"a855e43e-f243-4397-a92f-60285f679eee\") " Mar 13 10:37:06.188246 master-0 kubenswrapper[7508]: I0313 10:37:06.188136 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a855e43e-f243-4397-a92f-60285f679eee" (UID: "a855e43e-f243-4397-a92f-60285f679eee"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:37:06.188246 master-0 kubenswrapper[7508]: I0313 10:37:06.188189 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-config" (OuterVolumeSpecName: "config") pod "a855e43e-f243-4397-a92f-60285f679eee" (UID: "a855e43e-f243-4397-a92f-60285f679eee"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:37:06.188351 master-0 kubenswrapper[7508]: I0313 10:37:06.188259 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-client-ca" (OuterVolumeSpecName: "client-ca") pod "a855e43e-f243-4397-a92f-60285f679eee" (UID: "a855e43e-f243-4397-a92f-60285f679eee"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:37:06.224617 master-0 kubenswrapper[7508]: I0313 10:37:06.218343 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a855e43e-f243-4397-a92f-60285f679eee-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a855e43e-f243-4397-a92f-60285f679eee" (UID: "a855e43e-f243-4397-a92f-60285f679eee"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:37:06.224617 master-0 kubenswrapper[7508]: I0313 10:37:06.218752 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a855e43e-f243-4397-a92f-60285f679eee-kube-api-access-9szlb" (OuterVolumeSpecName: "kube-api-access-9szlb") pod "a855e43e-f243-4397-a92f-60285f679eee" (UID: "a855e43e-f243-4397-a92f-60285f679eee"). InnerVolumeSpecName "kube-api-access-9szlb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:06.288416 master-0 kubenswrapper[7508]: I0313 10:37:06.288354 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:06.288416 master-0 kubenswrapper[7508]: I0313 10:37:06.288414 7508 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:06.288573 master-0 kubenswrapper[7508]: I0313 10:37:06.288431 7508 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a855e43e-f243-4397-a92f-60285f679eee-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:06.288573 master-0 kubenswrapper[7508]: I0313 10:37:06.288446 7508 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a855e43e-f243-4397-a92f-60285f679eee-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:06.288573 master-0 kubenswrapper[7508]: I0313 10:37:06.288463 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9szlb\" (UniqueName: \"kubernetes.io/projected/a855e43e-f243-4397-a92f-60285f679eee-kube-api-access-9szlb\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:06.537247 master-0 kubenswrapper[7508]: I0313 10:37:06.536326 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 10:37:06.537247 master-0 kubenswrapper[7508]: E0313 10:37:06.536593 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a855e43e-f243-4397-a92f-60285f679eee" containerName="controller-manager" Mar 13 10:37:06.537247 master-0 kubenswrapper[7508]: I0313 10:37:06.536612 7508 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a855e43e-f243-4397-a92f-60285f679eee" containerName="controller-manager" Mar 13 10:37:06.537247 master-0 kubenswrapper[7508]: I0313 10:37:06.536721 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="a855e43e-f243-4397-a92f-60285f679eee" containerName="controller-manager" Mar 13 10:37:06.537247 master-0 kubenswrapper[7508]: I0313 10:37:06.537112 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.550211 master-0 kubenswrapper[7508]: I0313 10:37:06.548834 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 10:37:06.591912 master-0 kubenswrapper[7508]: I0313 10:37:06.591816 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.591912 master-0 kubenswrapper[7508]: I0313 10:37:06.591927 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-var-lock\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.592265 master-0 kubenswrapper[7508]: I0313 10:37:06.591953 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b488263-6a56-439c-945e-926936ed049d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.692346 master-0 kubenswrapper[7508]: I0313 
10:37:06.692277 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.692593 master-0 kubenswrapper[7508]: I0313 10:37:06.692365 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-var-lock\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.692593 master-0 kubenswrapper[7508]: I0313 10:37:06.692384 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b488263-6a56-439c-945e-926936ed049d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.692593 master-0 kubenswrapper[7508]: I0313 10:37:06.692480 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.692803 master-0 kubenswrapper[7508]: I0313 10:37:06.692749 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-var-lock\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.696561 master-0 kubenswrapper[7508]: I0313 10:37:06.696515 7508 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg"] Mar 13 10:37:06.697208 master-0 kubenswrapper[7508]: I0313 10:37:06.697175 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:06.699884 master-0 kubenswrapper[7508]: I0313 10:37:06.699856 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 10:37:06.700303 master-0 kubenswrapper[7508]: I0313 10:37:06.700281 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 10:37:06.700476 master-0 kubenswrapper[7508]: I0313 10:37:06.700454 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 10:37:06.719598 master-0 kubenswrapper[7508]: I0313 10:37:06.719562 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b488263-6a56-439c-945e-926936ed049d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.829667 master-0 kubenswrapper[7508]: I0313 10:37:06.829508 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:06.850154 master-0 kubenswrapper[7508]: I0313 10:37:06.849952 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79847c4f97-tf57f"] Mar 13 10:37:06.856665 master-0 kubenswrapper[7508]: I0313 10:37:06.856615 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:06.864016 master-0 kubenswrapper[7508]: I0313 10:37:06.863974 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerStarted","Data":"5147fab22d0f195a0d02e6752ac479b6a2eb3fafa582ebefd6e564393cc0c1fe"} Mar 13 10:37:06.864016 master-0 kubenswrapper[7508]: I0313 10:37:06.864009 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerStarted","Data":"572ea33856cce4705673f267dd6cdd4075e17161bfb7a4a9a4a7bdfe53ae4cca"} Mar 13 10:37:06.866522 master-0 kubenswrapper[7508]: I0313 10:37:06.866489 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" event={"ID":"a9258b0f-fdcc-4bfa-b982-5cf3c899c432","Type":"ContainerStarted","Data":"1c2d1da70477f9212bbcb8a5fb61059a816621fa16583f02d7521a05fdef2147"} Mar 13 10:37:06.870240 master-0 kubenswrapper[7508]: I0313 10:37:06.870212 7508 generic.go:334] "Generic (PLEG): container finished" podID="a855e43e-f243-4397-a92f-60285f679eee" containerID="0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211" exitCode=0 Mar 13 10:37:06.870299 master-0 kubenswrapper[7508]: I0313 10:37:06.870284 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" event={"ID":"a855e43e-f243-4397-a92f-60285f679eee","Type":"ContainerDied","Data":"0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211"} Mar 13 10:37:06.870340 master-0 kubenswrapper[7508]: I0313 10:37:06.870303 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" 
event={"ID":"a855e43e-f243-4397-a92f-60285f679eee","Type":"ContainerDied","Data":"942ce849bf198647449825e7f0abf9526ab36c0255252e914e77dd2cc7aba0c5"} Mar 13 10:37:06.870368 master-0 kubenswrapper[7508]: I0313 10:37:06.870342 7508 scope.go:117] "RemoveContainer" containerID="0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211" Mar 13 10:37:06.870460 master-0 kubenswrapper[7508]: I0313 10:37:06.870433 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5997c88d95-94gwc" Mar 13 10:37:06.881644 master-0 kubenswrapper[7508]: I0313 10:37:06.881604 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"046ee36d-4062-4c48-bab0-57381613b2ad","Type":"ContainerStarted","Data":"a4f70fa035c3abd6f5af326fa3fcfdfa3c4b57e2f6aae8e90bf89bf8fa6d8b52"} Mar 13 10:37:06.909446 master-0 kubenswrapper[7508]: I0313 10:37:06.898526 7508 scope.go:117] "RemoveContainer" containerID="0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211" Mar 13 10:37:06.909446 master-0 kubenswrapper[7508]: E0313 10:37:06.902196 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211\": container with ID starting with 0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211 not found: ID does not exist" containerID="0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211" Mar 13 10:37:06.909446 master-0 kubenswrapper[7508]: I0313 10:37:06.902226 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211"} err="failed to get container status \"0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211\": rpc error: code = NotFound desc = could not find container 
\"0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211\": container with ID starting with 0aac8a2a99fed0f880230d17542f9c9cc000fd13635a84a19032f171c850b211 not found: ID does not exist" Mar 13 10:37:06.959529 master-0 kubenswrapper[7508]: I0313 10:37:06.959472 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:06.959529 master-0 kubenswrapper[7508]: I0313 10:37:06.959529 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:06.959901 master-0 kubenswrapper[7508]: I0313 10:37:06.959582 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:06.959901 master-0 kubenswrapper[7508]: I0313 10:37:06.959606 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " 
pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:06.959901 master-0 kubenswrapper[7508]: I0313 10:37:06.959647 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p29b\" (UniqueName: \"kubernetes.io/projected/06ecac2e-bffa-474b-a824-9ba4a194159a-kube-api-access-6p29b\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:06.959901 master-0 kubenswrapper[7508]: I0313 10:37:06.959699 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/06ecac2e-bffa-474b-a824-9ba4a194159a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:06.959901 master-0 kubenswrapper[7508]: I0313 10:37:06.959728 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.060717 master-0 kubenswrapper[7508]: I0313 10:37:07.060642 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.060717 master-0 
kubenswrapper[7508]: I0313 10:37:07.060713 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.060987 master-0 kubenswrapper[7508]: I0313 10:37:07.060742 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.060987 master-0 kubenswrapper[7508]: I0313 10:37:07.060763 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.060987 master-0 kubenswrapper[7508]: I0313 10:37:07.060790 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p29b\" (UniqueName: \"kubernetes.io/projected/06ecac2e-bffa-474b-a824-9ba4a194159a-kube-api-access-6p29b\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:07.060987 master-0 kubenswrapper[7508]: I0313 10:37:07.060815 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/06ecac2e-bffa-474b-a824-9ba4a194159a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:07.060987 master-0 kubenswrapper[7508]: I0313 10:37:07.060837 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.062445 master-0 kubenswrapper[7508]: I0313 10:37:07.062411 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.064008 master-0 kubenswrapper[7508]: I0313 10:37:07.063969 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.064591 master-0 kubenswrapper[7508]: I0313 10:37:07.064560 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.067497 master-0 
kubenswrapper[7508]: I0313 10:37:07.067442 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.070852 master-0 kubenswrapper[7508]: I0313 10:37:07.070817 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/06ecac2e-bffa-474b-a824-9ba4a194159a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:07.187967 master-0 kubenswrapper[7508]: I0313 10:37:07.187896 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.394154 master-0 kubenswrapper[7508]: I0313 10:37:07.393621 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:37:07.394154 master-0 kubenswrapper[7508]: I0313 10:37:07.393761 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:37:07.394490 master-0 kubenswrapper[7508]: I0313 10:37:07.394366 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:07.586391 master-0 kubenswrapper[7508]: I0313 10:37:07.580423 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p29b\" (UniqueName: \"kubernetes.io/projected/06ecac2e-bffa-474b-a824-9ba4a194159a-kube-api-access-6p29b\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:07.627742 master-0 kubenswrapper[7508]: I0313 10:37:07.627345 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b8b68d8-a452-45f4-aaad-3d91cfb3e298" path="/var/lib/kubelet/pods/5b8b68d8-a452-45f4-aaad-3d91cfb3e298/volumes" Mar 13 10:37:07.628372 master-0 kubenswrapper[7508]: I0313 10:37:07.628010 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c" path="/var/lib/kubelet/pods/7f77e35d-f18f-4c0f-9e52-a651b7a8ca4c/volumes" Mar 13 10:37:07.628685 master-0 kubenswrapper[7508]: I0313 10:37:07.628643 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79847c4f97-tf57f"] Mar 13 10:37:07.628685 master-0 kubenswrapper[7508]: I0313 10:37:07.628677 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg"] Mar 13 10:37:07.636594 master-0 kubenswrapper[7508]: I0313 10:37:07.636299 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"] Mar 13 10:37:07.643312 master-0 kubenswrapper[7508]: I0313 10:37:07.638575 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" podStartSLOduration=29.638555674 podStartE2EDuration="29.638555674s" 
podCreationTimestamp="2026-03-13 10:36:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:07.555803325 +0000 UTC m=+66.298628452" watchObservedRunningTime="2026-03-13 10:37:07.638555674 +0000 UTC m=+66.381380791" Mar 13 10:37:07.749720 master-0 kubenswrapper[7508]: I0313 10:37:07.744481 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=8.744373394 podStartE2EDuration="8.744373394s" podCreationTimestamp="2026-03-13 10:36:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:07.676647517 +0000 UTC m=+66.419472634" watchObservedRunningTime="2026-03-13 10:37:07.744373394 +0000 UTC m=+66.487198531" Mar 13 10:37:07.749720 master-0 kubenswrapper[7508]: I0313 10:37:07.747397 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" podStartSLOduration=15.394633846 podStartE2EDuration="26.747385102s" podCreationTimestamp="2026-03-13 10:36:41 +0000 UTC" firstStartedPulling="2026-03-13 10:36:53.240335188 +0000 UTC m=+51.983160295" lastFinishedPulling="2026-03-13 10:37:04.593086434 +0000 UTC m=+63.335911551" observedRunningTime="2026-03-13 10:37:07.711481266 +0000 UTC m=+66.454306383" watchObservedRunningTime="2026-03-13 10:37:07.747385102 +0000 UTC m=+66.490210219" Mar 13 10:37:07.767935 master-0 kubenswrapper[7508]: I0313 10:37:07.767864 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5997c88d95-94gwc"] Mar 13 10:37:07.781897 master-0 kubenswrapper[7508]: I0313 10:37:07.781597 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5997c88d95-94gwc"] Mar 13 10:37:07.782201 master-0 
kubenswrapper[7508]: I0313 10:37:07.781960 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:37:07.807882 master-0 kubenswrapper[7508]: I0313 10:37:07.807844 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:37:07.893351 master-0 kubenswrapper[7508]: I0313 10:37:07.893294 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" event={"ID":"21bb85e2-0d4a-418f-a7c9-482e8eafce19","Type":"ContainerStarted","Data":"c81f55f61228604f6223600595f4c2e8e2f4dceb06b2bdad97b4839cb1807b1b"} Mar 13 10:37:07.893470 master-0 kubenswrapper[7508]: I0313 10:37:07.893380 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" event={"ID":"21bb85e2-0d4a-418f-a7c9-482e8eafce19","Type":"ContainerStarted","Data":"7623887564e1fd29b1c01e5d18c6715a43b71a693407bef1bea029e2735f11dd"} Mar 13 10:37:07.894822 master-0 kubenswrapper[7508]: I0313 10:37:07.894678 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:07.903643 master-0 kubenswrapper[7508]: I0313 10:37:07.903617 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:37:08.176131 master-0 kubenswrapper[7508]: I0313 10:37:08.175443 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 13 10:37:08.176131 master-0 kubenswrapper[7508]: I0313 10:37:08.176066 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.196492 master-0 kubenswrapper[7508]: I0313 10:37:08.196390 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 13 10:37:08.204947 master-0 kubenswrapper[7508]: I0313 10:37:08.204843 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" podStartSLOduration=4.204820023 podStartE2EDuration="4.204820023s" podCreationTimestamp="2026-03-13 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:08.202846391 +0000 UTC m=+66.945671508" watchObservedRunningTime="2026-03-13 10:37:08.204820023 +0000 UTC m=+66.947645130" Mar 13 10:37:08.207456 master-0 kubenswrapper[7508]: W0313 10:37:08.207412 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6b488263_6a56_439c_945e_926936ed049d.slice/crio-f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a WatchSource:0}: Error finding container f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a: Status 404 returned error can't find the container with id f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a Mar 13 10:37:08.210015 master-0 kubenswrapper[7508]: I0313 10:37:08.209812 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 13 10:37:08.242979 master-0 kubenswrapper[7508]: I0313 10:37:08.242332 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79847c4f97-tf57f"] Mar 13 10:37:08.262056 master-0 kubenswrapper[7508]: I0313 10:37:08.262010 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" 
(UniqueName: \"kubernetes.io/projected/1e86a3b0-37b3-4df1-a522-f29cda076753-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.262154 master-0 kubenswrapper[7508]: I0313 10:37:08.262059 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-var-lock\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.262154 master-0 kubenswrapper[7508]: I0313 10:37:08.262143 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.275189 master-0 kubenswrapper[7508]: I0313 10:37:08.275143 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg"] Mar 13 10:37:08.463175 master-0 kubenswrapper[7508]: I0313 10:37:08.460029 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86a3b0-37b3-4df1-a522-f29cda076753-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.463175 master-0 kubenswrapper[7508]: I0313 10:37:08.460116 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-var-lock\") pod \"installer-2-master-0\" (UID: 
\"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.463175 master-0 kubenswrapper[7508]: I0313 10:37:08.460189 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.463175 master-0 kubenswrapper[7508]: I0313 10:37:08.460336 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.463175 master-0 kubenswrapper[7508]: I0313 10:37:08.460713 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-var-lock\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.475127 master-0 kubenswrapper[7508]: I0313 10:37:08.467749 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:37:08.484889 master-0 kubenswrapper[7508]: I0313 10:37:08.484839 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86a3b0-37b3-4df1-a522-f29cda076753-kube-api-access\") pod \"installer-2-master-0\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.496410 master-0 kubenswrapper[7508]: I0313 
10:37:08.496351 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:37:08.922399 master-0 kubenswrapper[7508]: I0313 10:37:08.921945 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerStarted","Data":"5f4e5674ade432e52f9563a1f07684d2d9624c5df1e6b8e0fa3c971d3c078df8"} Mar 13 10:37:08.947613 master-0 kubenswrapper[7508]: I0313 10:37:08.947538 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerStarted","Data":"b090a7b841b2284b4a367b1fe9eb531751b92400aca909b51b87e9d7691a206c"} Mar 13 10:37:08.951320 master-0 kubenswrapper[7508]: I0313 10:37:08.950987 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"6b488263-6a56-439c-945e-926936ed049d","Type":"ContainerStarted","Data":"f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a"} Mar 13 10:37:09.859889 master-0 kubenswrapper[7508]: I0313 10:37:09.852087 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a855e43e-f243-4397-a92f-60285f679eee" path="/var/lib/kubelet/pods/a855e43e-f243-4397-a92f-60285f679eee/volumes" Mar 13 10:37:09.967133 master-0 kubenswrapper[7508]: I0313 10:37:09.966196 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerStarted","Data":"34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104"} Mar 13 10:37:10.436926 master-0 kubenswrapper[7508]: I0313 10:37:10.417318 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 13 10:37:10.652543 master-0 kubenswrapper[7508]: I0313 10:37:10.652436 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8"] Mar 13 10:37:10.654433 master-0 kubenswrapper[7508]: I0313 10:37:10.653911 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.659211 master-0 kubenswrapper[7508]: I0313 10:37:10.659146 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 10:37:10.667637 master-0 kubenswrapper[7508]: I0313 10:37:10.667582 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 10:37:10.667884 master-0 kubenswrapper[7508]: I0313 10:37:10.667665 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 10:37:10.667884 master-0 kubenswrapper[7508]: I0313 10:37:10.667665 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 10:37:10.667884 master-0 kubenswrapper[7508]: I0313 10:37:10.667854 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 10:37:10.691758 master-0 kubenswrapper[7508]: I0313 10:37:10.691681 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-auth-proxy-config\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.691934 
master-0 kubenswrapper[7508]: I0313 10:37:10.691770 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-machine-approver-tls\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.691934 master-0 kubenswrapper[7508]: I0313 10:37:10.691810 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-config\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.691934 master-0 kubenswrapper[7508]: I0313 10:37:10.691840 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc2r2\" (UniqueName: \"kubernetes.io/projected/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-kube-api-access-zc2r2\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.796603 master-0 kubenswrapper[7508]: I0313 10:37:10.792168 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-config\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.796603 master-0 kubenswrapper[7508]: I0313 10:37:10.792205 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zc2r2\" (UniqueName: 
\"kubernetes.io/projected/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-kube-api-access-zc2r2\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.796603 master-0 kubenswrapper[7508]: I0313 10:37:10.792250 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-auth-proxy-config\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.796603 master-0 kubenswrapper[7508]: I0313 10:37:10.792286 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-machine-approver-tls\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.803465 master-0 kubenswrapper[7508]: I0313 10:37:10.803402 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-config\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.803790 master-0 kubenswrapper[7508]: I0313 10:37:10.803714 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-auth-proxy-config\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 
10:37:10.834004 master-0 kubenswrapper[7508]: I0313 10:37:10.830951 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-machine-approver-tls\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:10.943503 master-0 kubenswrapper[7508]: I0313 10:37:10.943188 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:37:10.943606 master-0 kubenswrapper[7508]: I0313 10:37:10.943567 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:37:10.951571 master-0 kubenswrapper[7508]: I0313 10:37:10.951392 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_e1a3cdd6-88be-4a7f-955c-2f0b22082e82/installer/0.log" Mar 13 10:37:10.951571 master-0 kubenswrapper[7508]: I0313 10:37:10.951473 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 10:37:10.966061 master-0 kubenswrapper[7508]: I0313 10:37:10.966001 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"6b488263-6a56-439c-945e-926936ed049d","Type":"ContainerStarted","Data":"cfc30e3ed734f4cb74033d3d0ab50e918052fd74c62e5f4931d21fcdfbcbd074"} Mar 13 10:37:10.968555 master-0 kubenswrapper[7508]: I0313 10:37:10.968511 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_e1a3cdd6-88be-4a7f-955c-2f0b22082e82/installer/0.log" Mar 13 10:37:10.968880 master-0 kubenswrapper[7508]: I0313 10:37:10.968555 7508 generic.go:334] "Generic (PLEG): container finished" podID="e1a3cdd6-88be-4a7f-955c-2f0b22082e82" containerID="a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0" exitCode=1 Mar 13 10:37:10.968880 master-0 kubenswrapper[7508]: I0313 10:37:10.968607 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e1a3cdd6-88be-4a7f-955c-2f0b22082e82","Type":"ContainerDied","Data":"a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0"} Mar 13 10:37:10.968880 master-0 kubenswrapper[7508]: I0313 10:37:10.968630 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"e1a3cdd6-88be-4a7f-955c-2f0b22082e82","Type":"ContainerDied","Data":"98d3834f79a7a852f9b92d014f5509a2a10b0b4a9a2902b60d45ac88cc6cadb6"} Mar 13 10:37:10.968880 master-0 kubenswrapper[7508]: I0313 10:37:10.968674 7508 scope.go:117] "RemoveContainer" containerID="a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0" Mar 13 10:37:10.968880 master-0 kubenswrapper[7508]: I0313 10:37:10.968773 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 13 10:37:10.975812 master-0 kubenswrapper[7508]: I0313 10:37:10.975771 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"1e86a3b0-37b3-4df1-a522-f29cda076753","Type":"ContainerStarted","Data":"cc178eff65e9e37dfca64d7638a02200669b20cdded82a2b29fd98ec8a15cc9e"} Mar 13 10:37:10.976409 master-0 kubenswrapper[7508]: I0313 10:37:10.976374 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:10.982198 master-0 kubenswrapper[7508]: I0313 10:37:10.982168 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:37:10.995145 master-0 kubenswrapper[7508]: I0313 10:37:10.995119 7508 scope.go:117] "RemoveContainer" containerID="a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0" Mar 13 10:37:10.995853 master-0 kubenswrapper[7508]: E0313 10:37:10.995795 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0\": container with ID starting with a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0 not found: ID does not exist" containerID="a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0" Mar 13 10:37:10.995962 master-0 kubenswrapper[7508]: I0313 10:37:10.995870 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0"} err="failed to get container status \"a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0\": rpc error: code = NotFound desc = could not find container 
\"a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0\": container with ID starting with a4f2816ce8b39f07dbd3ab6cfa898a3a09420e5aba597691d50442252c3160f0 not found: ID does not exist" Mar 13 10:37:11.095817 master-0 kubenswrapper[7508]: I0313 10:37:11.095770 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kube-api-access\") pod \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " Mar 13 10:37:11.095817 master-0 kubenswrapper[7508]: I0313 10:37:11.095822 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kubelet-dir\") pod \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " Mar 13 10:37:11.096139 master-0 kubenswrapper[7508]: I0313 10:37:11.095839 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-var-lock\") pod \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\" (UID: \"e1a3cdd6-88be-4a7f-955c-2f0b22082e82\") " Mar 13 10:37:11.096139 master-0 kubenswrapper[7508]: I0313 10:37:11.096031 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-var-lock" (OuterVolumeSpecName: "var-lock") pod "e1a3cdd6-88be-4a7f-955c-2f0b22082e82" (UID: "e1a3cdd6-88be-4a7f-955c-2f0b22082e82"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:11.096139 master-0 kubenswrapper[7508]: I0313 10:37:11.096013 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e1a3cdd6-88be-4a7f-955c-2f0b22082e82" (UID: "e1a3cdd6-88be-4a7f-955c-2f0b22082e82"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:11.099732 master-0 kubenswrapper[7508]: I0313 10:37:11.099651 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e1a3cdd6-88be-4a7f-955c-2f0b22082e82" (UID: "e1a3cdd6-88be-4a7f-955c-2f0b22082e82"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:11.197037 master-0 kubenswrapper[7508]: I0313 10:37:11.196872 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:11.197037 master-0 kubenswrapper[7508]: I0313 10:37:11.196916 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:11.197037 master-0 kubenswrapper[7508]: I0313 10:37:11.196929 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e1a3cdd6-88be-4a7f-955c-2f0b22082e82-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: I0313 10:37:12.643953 7508 patch_prober.go:28] interesting pod/apiserver-576d4447f8-zqphk container/openshift-apiserver 
namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]log ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]etcd ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/max-in-flight-filter ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/project.openshift.io-projectcache ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/openshift.io-startinformers ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: livez check failed Mar 13 10:37:12.646018 master-0 kubenswrapper[7508]: I0313 10:37:12.644060 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" podUID="018c9219-d314-4408-ac39-93475d87eefb" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 
500" Mar 13 10:37:12.667121 master-0 kubenswrapper[7508]: I0313 10:37:12.666872 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc2r2\" (UniqueName: \"kubernetes.io/projected/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-kube-api-access-zc2r2\") pod \"machine-approver-955fcfb87-pbxm8\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:12.808629 master-0 kubenswrapper[7508]: I0313 10:37:12.808171 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podStartSLOduration=8.808076922 podStartE2EDuration="8.808076922s" podCreationTimestamp="2026-03-13 10:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:12.807610989 +0000 UTC m=+71.550436106" watchObservedRunningTime="2026-03-13 10:37:12.808076922 +0000 UTC m=+71.550902039" Mar 13 10:37:12.900238 master-0 kubenswrapper[7508]: I0313 10:37:12.900109 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:37:12.927156 master-0 kubenswrapper[7508]: I0313 10:37:12.925940 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 10:37:12.939133 master-0 kubenswrapper[7508]: I0313 10:37:12.939032 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 13 10:37:12.978957 master-0 kubenswrapper[7508]: I0313 10:37:12.978339 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=6.978310922 podStartE2EDuration="6.978310922s" podCreationTimestamp="2026-03-13 10:37:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:12.963524616 +0000 UTC m=+71.706349743" watchObservedRunningTime="2026-03-13 10:37:12.978310922 +0000 UTC m=+71.721136039" Mar 13 10:37:12.996048 master-0 kubenswrapper[7508]: I0313 10:37:12.995976 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"1e86a3b0-37b3-4df1-a522-f29cda076753","Type":"ContainerStarted","Data":"d19b978c1e8101a0212df3b6611d9d31aa1e8b34d80df670a9b5c7dd94abdbf2"} Mar 13 10:37:13.001958 master-0 kubenswrapper[7508]: I0313 10:37:13.001804 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerStarted","Data":"31634e1fa2a526a5eef76adce598a8e242bdd09cd3c5df9b79281ebf5788e31f"} Mar 13 10:37:13.005853 master-0 kubenswrapper[7508]: I0313 10:37:13.005073 7508 generic.go:334] "Generic (PLEG): container finished" podID="6e69683c-59c5-43da-b105-ef2efb2d0a4e" 
containerID="9c421d2fac6d7087c86a68ae07bf424407e762fa4149a323b0ac68e925b5c3b2" exitCode=0 Mar 13 10:37:13.005853 master-0 kubenswrapper[7508]: I0313 10:37:13.005280 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerDied","Data":"9c421d2fac6d7087c86a68ae07bf424407e762fa4149a323b0ac68e925b5c3b2"} Mar 13 10:37:13.006083 master-0 kubenswrapper[7508]: I0313 10:37:13.005824 7508 scope.go:117] "RemoveContainer" containerID="9c421d2fac6d7087c86a68ae07bf424407e762fa4149a323b0ac68e925b5c3b2" Mar 13 10:37:13.054015 master-0 kubenswrapper[7508]: I0313 10:37:13.053931 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=6.053903993 podStartE2EDuration="6.053903993s" podCreationTimestamp="2026-03-13 10:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:37:13.031694134 +0000 UTC m=+71.774519271" watchObservedRunningTime="2026-03-13 10:37:13.053903993 +0000 UTC m=+71.796729120" Mar 13 10:37:13.057930 master-0 kubenswrapper[7508]: I0313 10:37:13.057881 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 10:37:13.058673 master-0 kubenswrapper[7508]: I0313 10:37:13.058634 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-1-master-0" podUID="046ee36d-4062-4c48-bab0-57381613b2ad" containerName="installer" containerID="cri-o://a4f70fa035c3abd6f5af326fa3fcfdfa3c4b57e2f6aae8e90bf89bf8fa6d8b52" gracePeriod=30 Mar 13 10:37:13.725541 master-0 kubenswrapper[7508]: I0313 10:37:13.706181 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1a3cdd6-88be-4a7f-955c-2f0b22082e82" 
path="/var/lib/kubelet/pods/e1a3cdd6-88be-4a7f-955c-2f0b22082e82/volumes" Mar 13 10:37:14.017852 master-0 kubenswrapper[7508]: I0313 10:37:14.017771 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerStarted","Data":"fe42327b95dec5367f541c81b048f39545c2d05c4325d9527175937bbfdf24b4"} Mar 13 10:37:14.037854 master-0 kubenswrapper[7508]: I0313 10:37:14.037782 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg"] Mar 13 10:37:14.038293 master-0 kubenswrapper[7508]: E0313 10:37:14.038081 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e1a3cdd6-88be-4a7f-955c-2f0b22082e82" containerName="installer" Mar 13 10:37:14.038293 master-0 kubenswrapper[7508]: I0313 10:37:14.038134 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1a3cdd6-88be-4a7f-955c-2f0b22082e82" containerName="installer" Mar 13 10:37:14.038293 master-0 kubenswrapper[7508]: I0313 10:37:14.038253 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="e1a3cdd6-88be-4a7f-955c-2f0b22082e82" containerName="installer" Mar 13 10:37:14.039403 master-0 kubenswrapper[7508]: I0313 10:37:14.039365 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.039501 master-0 kubenswrapper[7508]: I0313 10:37:14.039432 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerStarted","Data":"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"} Mar 13 10:37:14.042577 master-0 kubenswrapper[7508]: I0313 10:37:14.042538 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 10:37:14.042799 master-0 kubenswrapper[7508]: I0313 10:37:14.042773 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 10:37:14.044541 master-0 kubenswrapper[7508]: I0313 10:37:14.044350 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 13 10:37:14.069171 master-0 kubenswrapper[7508]: I0313 10:37:14.068153 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 10:37:14.074975 master-0 kubenswrapper[7508]: I0313 10:37:14.074923 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg"] Mar 13 10:37:14.110713 master-0 kubenswrapper[7508]: I0313 10:37:14.110562 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 
10:37:14.110713 master-0 kubenswrapper[7508]: I0313 10:37:14.110616 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvl4j\" (UniqueName: \"kubernetes.io/projected/b02805e2-f186-4e59-bdfa-f4793263b468-kube-api-access-cvl4j\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.110713 master-0 kubenswrapper[7508]: I0313 10:37:14.110637 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.213255 master-0 kubenswrapper[7508]: I0313 10:37:14.212142 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.213255 master-0 kubenswrapper[7508]: I0313 10:37:14.212191 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvl4j\" (UniqueName: \"kubernetes.io/projected/b02805e2-f186-4e59-bdfa-f4793263b468-kube-api-access-cvl4j\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.213255 master-0 kubenswrapper[7508]: I0313 10:37:14.212212 
7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.214335 master-0 kubenswrapper[7508]: I0313 10:37:14.214298 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.232946 master-0 kubenswrapper[7508]: I0313 10:37:14.232532 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.250283 master-0 kubenswrapper[7508]: I0313 10:37:14.250223 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvl4j\" (UniqueName: \"kubernetes.io/projected/b02805e2-f186-4e59-bdfa-f4793263b468-kube-api-access-cvl4j\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:14.372964 master-0 kubenswrapper[7508]: I0313 10:37:14.372834 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:37:15.385207 master-0 kubenswrapper[7508]: I0313 10:37:15.383918 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg"] Mar 13 10:37:15.944560 master-0 kubenswrapper[7508]: I0313 10:37:15.944513 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:37:15.950060 master-0 kubenswrapper[7508]: I0313 10:37:15.950016 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:37:16.070876 master-0 kubenswrapper[7508]: I0313 10:37:16.070783 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" event={"ID":"b02805e2-f186-4e59-bdfa-f4793263b468","Type":"ContainerStarted","Data":"82c2160bbc4014a38023fe88cc8ab1055a69a4c32765b0ad1ae3def9ef497d37"} Mar 13 10:37:16.070876 master-0 kubenswrapper[7508]: I0313 10:37:16.070842 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" event={"ID":"b02805e2-f186-4e59-bdfa-f4793263b468","Type":"ContainerStarted","Data":"8988806dc69dce5b61c53cc2845447a33f520244d709f93fdb6f76499aee8916"} Mar 13 10:37:16.092429 master-0 kubenswrapper[7508]: I0313 10:37:16.091809 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl"] Mar 13 10:37:16.100753 master-0 kubenswrapper[7508]: I0313 10:37:16.100698 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.104049 master-0 kubenswrapper[7508]: I0313 10:37:16.104008 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 13 10:37:16.104727 master-0 kubenswrapper[7508]: I0313 10:37:16.104686 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 13 10:37:16.107946 master-0 kubenswrapper[7508]: I0313 10:37:16.107897 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 13 10:37:16.109056 master-0 kubenswrapper[7508]: I0313 10:37:16.108248 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 13 10:37:16.126179 master-0 kubenswrapper[7508]: I0313 10:37:16.125190 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl"] Mar 13 10:37:16.247389 master-0 kubenswrapper[7508]: I0313 10:37:16.247276 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjkdx\" (UniqueName: \"kubernetes.io/projected/0881de70-2db3-4fc2-b976-b55c11dc239d-kube-api-access-vjkdx\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.247643 master-0 kubenswrapper[7508]: I0313 10:37:16.247599 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.247800 master-0 kubenswrapper[7508]: I0313 10:37:16.247766 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.247848 master-0 kubenswrapper[7508]: I0313 10:37:16.247808 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.248061 master-0 kubenswrapper[7508]: I0313 10:37:16.248041 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.349623 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkdx\" (UniqueName: \"kubernetes.io/projected/0881de70-2db3-4fc2-b976-b55c11dc239d-kube-api-access-vjkdx\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.349673 7508 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.349887 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.349966 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.350064 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.350906 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: 
\"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.509171 master-0 kubenswrapper[7508]: I0313 10:37:16.351158 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.514401 master-0 kubenswrapper[7508]: I0313 10:37:16.514295 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.528238 master-0 kubenswrapper[7508]: I0313 10:37:16.521466 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.563620 master-0 kubenswrapper[7508]: I0313 10:37:16.563540 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkdx\" (UniqueName: \"kubernetes.io/projected/0881de70-2db3-4fc2-b976-b55c11dc239d-kube-api-access-vjkdx\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.733735 master-0 kubenswrapper[7508]: I0313 10:37:16.733678 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:37:16.967132 master-0 kubenswrapper[7508]: I0313 10:37:16.966994 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt"] Mar 13 10:37:16.971136 master-0 kubenswrapper[7508]: I0313 10:37:16.968222 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:16.971136 master-0 kubenswrapper[7508]: I0313 10:37:16.970334 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 10:37:16.971136 master-0 kubenswrapper[7508]: I0313 10:37:16.970479 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 13 10:37:16.999304 master-0 kubenswrapper[7508]: I0313 10:37:16.999230 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt"] Mar 13 10:37:17.065922 master-0 kubenswrapper[7508]: I0313 10:37:17.065753 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:17.065922 master-0 kubenswrapper[7508]: I0313 10:37:17.065840 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " 
pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:17.065922 master-0 kubenswrapper[7508]: I0313 10:37:17.065871 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chxxr\" (UniqueName: \"kubernetes.io/projected/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-kube-api-access-chxxr\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:18.076022 master-0 kubenswrapper[7508]: I0313 10:37:18.074396 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:18.076022 master-0 kubenswrapper[7508]: I0313 10:37:18.074481 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:18.076022 master-0 kubenswrapper[7508]: I0313 10:37:18.074510 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxxr\" (UniqueName: \"kubernetes.io/projected/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-kube-api-access-chxxr\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:18.081148 master-0 kubenswrapper[7508]: I0313 10:37:18.078123 7508 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:18.085058 master-0 kubenswrapper[7508]: I0313 10:37:18.083432 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:18.583048 master-0 kubenswrapper[7508]: I0313 10:37:18.582940 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 10:37:18.584579 master-0 kubenswrapper[7508]: I0313 10:37:18.584535 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.587853 master-0 kubenswrapper[7508]: I0313 10:37:18.587795 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hnsk9" Mar 13 10:37:18.642807 master-0 kubenswrapper[7508]: I0313 10:37:18.642745 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.642807 master-0 kubenswrapper[7508]: I0313 10:37:18.642817 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c834b554-c652-4f45-9110-3d4e260ba98a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.643116 master-0 kubenswrapper[7508]: I0313 10:37:18.642903 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-var-lock\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.743807 master-0 kubenswrapper[7508]: I0313 10:37:18.743732 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-var-lock\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.744006 master-0 kubenswrapper[7508]: I0313 10:37:18.743843 7508 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.744006 master-0 kubenswrapper[7508]: I0313 10:37:18.743878 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c834b554-c652-4f45-9110-3d4e260ba98a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.744006 master-0 kubenswrapper[7508]: I0313 10:37:18.743912 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-var-lock\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.744166 master-0 kubenswrapper[7508]: I0313 10:37:18.744011 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:18.999255 master-0 kubenswrapper[7508]: I0313 10:37:18.997764 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 10:37:19.057051 master-0 kubenswrapper[7508]: I0313 10:37:19.056204 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl"] Mar 13 10:37:19.061718 master-0 kubenswrapper[7508]: I0313 10:37:19.061643 7508 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-chxxr\" (UniqueName: \"kubernetes.io/projected/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-kube-api-access-chxxr\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:37:19.078160 master-0 kubenswrapper[7508]: I0313 10:37:19.077450 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c834b554-c652-4f45-9110-3d4e260ba98a-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:19.147467 master-0 kubenswrapper[7508]: I0313 10:37:19.146501 7508 generic.go:334] "Generic (PLEG): container finished" podID="893dac15-d6d4-4a1f-988c-59aaf9e63334" containerID="32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0" exitCode=0 Mar 13 10:37:19.147467 master-0 kubenswrapper[7508]: I0313 10:37:19.146571 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerDied","Data":"32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0"} Mar 13 10:37:19.147467 master-0 kubenswrapper[7508]: I0313 10:37:19.147265 7508 scope.go:117] "RemoveContainer" containerID="32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0" Mar 13 10:37:19.240008 master-0 kubenswrapper[7508]: I0313 10:37:19.236742 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"] Mar 13 10:37:19.240008 master-0 kubenswrapper[7508]: I0313 10:37:19.237998 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:37:19.244355 master-0 kubenswrapper[7508]: I0313 10:37:19.244272 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:37:19.259793 master-0 kubenswrapper[7508]: I0313 10:37:19.259312 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 13 10:37:19.269235 master-0 kubenswrapper[7508]: I0313 10:37:19.267605 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b57f1c19-f44a-4405-8135-79aef1d1ce07-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:37:19.269235 master-0 kubenswrapper[7508]: I0313 10:37:19.267726 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mnnnp\" (UniqueName: \"kubernetes.io/projected/b57f1c19-f44a-4405-8135-79aef1d1ce07-kube-api-access-mnnnp\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:37:19.293108 master-0 kubenswrapper[7508]: I0313 10:37:19.291055 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-v9x5b"] Mar 13 10:37:19.293108 master-0 kubenswrapper[7508]: I0313 10:37:19.292043 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:37:19.300331 master-0 kubenswrapper[7508]: I0313 10:37:19.299443 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 10:37:19.300331 master-0 kubenswrapper[7508]: I0313 10:37:19.299695 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 13 10:37:19.300331 master-0 kubenswrapper[7508]: I0313 10:37:19.299856 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 13 10:37:19.300331 master-0 kubenswrapper[7508]: I0313 10:37:19.299960 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-h7hlp" Mar 13 10:37:19.304249 master-0 kubenswrapper[7508]: I0313 10:37:19.304112 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 13 10:37:19.310273 master-0 kubenswrapper[7508]: I0313 10:37:19.306411 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 13 10:37:19.313311 master-0 kubenswrapper[7508]: I0313 10:37:19.312894 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"] Mar 13 10:37:19.414967 master-0 kubenswrapper[7508]: I0313 10:37:19.392351 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-v9x5b"] Mar 13 10:37:19.414967 master-0 kubenswrapper[7508]: I0313 10:37:19.392851 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt"
Mar 13 10:37:19.414967 master-0 kubenswrapper[7508]: I0313 10:37:19.399462 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mnnnp\" (UniqueName: \"kubernetes.io/projected/b57f1c19-f44a-4405-8135-79aef1d1ce07-kube-api-access-mnnnp\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"
Mar 13 10:37:19.414967 master-0 kubenswrapper[7508]: I0313 10:37:19.399532 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b57f1c19-f44a-4405-8135-79aef1d1ce07-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"
Mar 13 10:37:19.426122 master-0 kubenswrapper[7508]: I0313 10:37:19.424131 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b57f1c19-f44a-4405-8135-79aef1d1ce07-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"
Mar 13 10:37:19.480170 master-0 kubenswrapper[7508]: I0313 10:37:19.477580 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"]
Mar 13 10:37:19.480170 master-0 kubenswrapper[7508]: I0313 10:37:19.478861 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.492521 master-0 kubenswrapper[7508]: I0313 10:37:19.488655 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 13 10:37:19.492521 master-0 kubenswrapper[7508]: I0313 10:37:19.488997 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 10:37:19.492521 master-0 kubenswrapper[7508]: I0313 10:37:19.489180 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 13 10:37:19.492521 master-0 kubenswrapper[7508]: I0313 10:37:19.490164 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:37:19.504042 master-0 kubenswrapper[7508]: I0313 10:37:19.503468 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 13 10:37:19.504234 master-0 kubenswrapper[7508]: I0313 10:37:19.504149 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.504234 master-0 kubenswrapper[7508]: I0313 10:37:19.504180 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-snapshots\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.504234 master-0 kubenswrapper[7508]: I0313 10:37:19.504224 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.504355 master-0 kubenswrapper[7508]: I0313 10:37:19.504268 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8dpd\" (UniqueName: \"kubernetes.io/projected/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-kube-api-access-g8dpd\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.504355 master-0 kubenswrapper[7508]: I0313 10:37:19.504302 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.510294 master-0 kubenswrapper[7508]: I0313 10:37:19.508050 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnnnp\" (UniqueName: \"kubernetes.io/projected/b57f1c19-f44a-4405-8135-79aef1d1ce07-kube-api-access-mnnnp\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"
Mar 13 10:37:19.530136 master-0 kubenswrapper[7508]: I0313 10:37:19.524817 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlpwf"]
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671127 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fbad53-304a-4338-974e-d9974921c48f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671200 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pb688\" (UniqueName: \"kubernetes.io/projected/d6fbad53-304a-4338-974e-d9974921c48f-kube-api-access-pb688\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671230 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6fbad53-304a-4338-974e-d9974921c48f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671257 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dpd\" (UniqueName: \"kubernetes.io/projected/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-kube-api-access-g8dpd\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671287 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671338 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671368 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671394 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-snapshots\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671424 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.673563 master-0 kubenswrapper[7508]: I0313 10:37:19.671446 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.674336 master-0 kubenswrapper[7508]: I0313 10:37:19.674303 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.674433 master-0 kubenswrapper[7508]: I0313 10:37:19.674390 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.674876 master-0 kubenswrapper[7508]: I0313 10:37:19.674839 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-snapshots\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.683585 master-0 kubenswrapper[7508]: I0313 10:37:19.683546 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:19.772945 master-0 kubenswrapper[7508]: I0313 10:37:19.772820 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6fbad53-304a-4338-974e-d9974921c48f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.772945 master-0 kubenswrapper[7508]: I0313 10:37:19.772890 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.773253 master-0 kubenswrapper[7508]: I0313 10:37:19.772960 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.773253 master-0 kubenswrapper[7508]: I0313 10:37:19.772997 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fbad53-304a-4338-974e-d9974921c48f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.773253 master-0 kubenswrapper[7508]: I0313 10:37:19.773022 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pb688\" (UniqueName: \"kubernetes.io/projected/d6fbad53-304a-4338-974e-d9974921c48f-kube-api-access-pb688\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.774071 master-0 kubenswrapper[7508]: I0313 10:37:19.774025 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.774384 master-0 kubenswrapper[7508]: I0313 10:37:19.774348 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6fbad53-304a-4338-974e-d9974921c48f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.774647 master-0 kubenswrapper[7508]: I0313 10:37:19.774559 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.780913 master-0 kubenswrapper[7508]: I0313 10:37:19.780856 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fbad53-304a-4338-974e-d9974921c48f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:19.793084 master-0 kubenswrapper[7508]: I0313 10:37:19.793028 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"
Mar 13 10:37:20.193586 master-0 kubenswrapper[7508]: I0313 10:37:20.192940 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"]
Mar 13 10:37:20.202548 master-0 kubenswrapper[7508]: I0313 10:37:20.198841 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.203163 master-0 kubenswrapper[7508]: I0313 10:37:20.202971 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 10:37:20.208918 master-0 kubenswrapper[7508]: I0313 10:37:20.206234 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 10:37:20.208918 master-0 kubenswrapper[7508]: I0313 10:37:20.206579 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 10:37:20.240972 master-0 kubenswrapper[7508]: I0313 10:37:20.239852 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-kqrsd"]
Mar 13 10:37:20.244467 master-0 kubenswrapper[7508]: I0313 10:37:20.242525 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29dk6"]
Mar 13 10:37:20.244467 master-0 kubenswrapper[7508]: I0313 10:37:20.242645 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.270571 master-0 kubenswrapper[7508]: I0313 10:37:20.270527 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pb688\" (UniqueName: \"kubernetes.io/projected/d6fbad53-304a-4338-974e-d9974921c48f-kube-api-access-pb688\") pod \"cluster-cloud-controller-manager-operator-559568b945-vmt9j\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:20.280588 master-0 kubenswrapper[7508]: I0313 10:37:20.278414 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-r9v82"
Mar 13 10:37:20.298198 master-0 kubenswrapper[7508]: I0313 10:37:20.294408 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dpd\" (UniqueName: \"kubernetes.io/projected/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-kube-api-access-g8dpd\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:20.340204 master-0 kubenswrapper[7508]: I0313 10:37:20.338355 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"]
Mar 13 10:37:20.382657 master-0 kubenswrapper[7508]: I0313 10:37:20.381698 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqrsd"]
Mar 13 10:37:20.396209 master-0 kubenswrapper[7508]: I0313 10:37:20.396156 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fcqg\" (UniqueName: \"kubernetes.io/projected/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-kube-api-access-4fcqg\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.396209 master-0 kubenswrapper[7508]: I0313 10:37:20.396209 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-catalog-content\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.396474 master-0 kubenswrapper[7508]: I0313 10:37:20.396261 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-utilities\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.396474 master-0 kubenswrapper[7508]: I0313 10:37:20.396283 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gntlk\" (UniqueName: \"kubernetes.io/projected/2157cb66-d458-4353-bc9c-ef761e61e5c5-kube-api-access-gntlk\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.396474 master-0 kubenswrapper[7508]: I0313 10:37:20.396316 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.411614 master-0 kubenswrapper[7508]: I0313 10:37:20.409973 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"]
Mar 13 10:37:20.411614 master-0 kubenswrapper[7508]: I0313 10:37:20.411013 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.418651 master-0 kubenswrapper[7508]: I0313 10:37:20.418596 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 10:37:20.418651 master-0 kubenswrapper[7508]: I0313 10:37:20.418618 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 10:37:20.418885 master-0 kubenswrapper[7508]: I0313 10:37:20.418785 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-t57pn"
Mar 13 10:37:20.419246 master-0 kubenswrapper[7508]: I0313 10:37:20.419201 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 10:37:20.419388 master-0 kubenswrapper[7508]: I0313 10:37:20.419361 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 10:37:20.419518 master-0 kubenswrapper[7508]: I0313 10:37:20.419499 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 10:37:20.423342 master-0 kubenswrapper[7508]: I0313 10:37:20.423292 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-kwwkz"]
Mar 13 10:37:20.425671 master-0 kubenswrapper[7508]: I0313 10:37:20.425646 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.428328 master-0 kubenswrapper[7508]: I0313 10:37:20.427305 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-d2pmx"
Mar 13 10:37:20.432724 master-0 kubenswrapper[7508]: I0313 10:37:20.432137 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:37:20.443264 master-0 kubenswrapper[7508]: I0313 10:37:20.443234 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"]
Mar 13 10:37:20.443583 master-0 kubenswrapper[7508]: I0313 10:37:20.443532 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:37:20.445010 master-0 kubenswrapper[7508]: I0313 10:37:20.444901 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwwkz"]
Mar 13 10:37:20.497126 master-0 kubenswrapper[7508]: I0313 10:37:20.497046 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-utilities\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.497402 master-0 kubenswrapper[7508]: I0313 10:37:20.497132 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gntlk\" (UniqueName: \"kubernetes.io/projected/2157cb66-d458-4353-bc9c-ef761e61e5c5-kube-api-access-gntlk\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.497402 master-0 kubenswrapper[7508]: I0313 10:37:20.497209 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.497402 master-0 kubenswrapper[7508]: I0313 10:37:20.497243 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fcqg\" (UniqueName: \"kubernetes.io/projected/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-kube-api-access-4fcqg\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.497402 master-0 kubenswrapper[7508]: I0313 10:37:20.497267 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-catalog-content\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.497839 master-0 kubenswrapper[7508]: I0313 10:37:20.497816 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-catalog-content\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.498151 master-0 kubenswrapper[7508]: I0313 10:37:20.498127 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-utilities\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.521684 master-0 kubenswrapper[7508]: I0313 10:37:20.521606 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.534191 master-0 kubenswrapper[7508]: I0313 10:37:20.534123 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fcqg\" (UniqueName: \"kubernetes.io/projected/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-kube-api-access-4fcqg\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.544645 master-0 kubenswrapper[7508]: I0313 10:37:20.544598 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntlk\" (UniqueName: \"kubernetes.io/projected/2157cb66-d458-4353-bc9c-ef761e61e5c5-kube-api-access-gntlk\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.573931 master-0 kubenswrapper[7508]: I0313 10:37:20.573879 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:37:20.598434 master-0 kubenswrapper[7508]: I0313 10:37:20.598364 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.605133 master-0 kubenswrapper[7508]: I0313 10:37:20.598449 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.605133 master-0 kubenswrapper[7508]: I0313 10:37:20.604320 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-utilities\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.605133 master-0 kubenswrapper[7508]: I0313 10:37:20.604380 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.605133 master-0 kubenswrapper[7508]: I0313 10:37:20.604408 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vbsc\" (UniqueName: \"kubernetes.io/projected/61427254-6722-4d1a-a96a-dadd24abbe94-kube-api-access-6vbsc\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.605133 master-0 kubenswrapper[7508]: I0313 10:37:20.604547 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fswp7\" (UniqueName: \"kubernetes.io/projected/257ae542-4a06-42d3-b3e8-bf0a376494a8-kube-api-access-fswp7\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.605133 master-0 kubenswrapper[7508]: I0313 10:37:20.604676 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-catalog-content\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.716483 master-0 kubenswrapper[7508]: I0313 10:37:20.716317 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:37:20.716771 master-0 kubenswrapper[7508]: I0313 10:37:20.716739 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.716879 master-0 kubenswrapper[7508]: I0313 10:37:20.716858 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-utilities\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.716970 master-0 kubenswrapper[7508]: I0313 10:37:20.716958 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.717057 master-0 kubenswrapper[7508]: I0313 10:37:20.717043 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbsc\" (UniqueName: \"kubernetes.io/projected/61427254-6722-4d1a-a96a-dadd24abbe94-kube-api-access-6vbsc\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.717161 master-0 kubenswrapper[7508]: I0313 10:37:20.717148 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswp7\" (UniqueName: \"kubernetes.io/projected/257ae542-4a06-42d3-b3e8-bf0a376494a8-kube-api-access-fswp7\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.717258 master-0 kubenswrapper[7508]: I0313 10:37:20.717245 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-catalog-content\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.717367 master-0 kubenswrapper[7508]: I0313 10:37:20.717352 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.719587 master-0 kubenswrapper[7508]: I0313 10:37:20.718373 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.719587 master-0 kubenswrapper[7508]: I0313 10:37:20.719064 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.719587 master-0 kubenswrapper[7508]: I0313 10:37:20.719216 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-utilities\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.720080 master-0 kubenswrapper[7508]: I0313 10:37:20.720057 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-catalog-content\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.723985 master-0 kubenswrapper[7508]: I0313 10:37:20.723946 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.747192 master-0 kubenswrapper[7508]: I0313 10:37:20.747132 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswp7\" (UniqueName: \"kubernetes.io/projected/257ae542-4a06-42d3-b3e8-bf0a376494a8-kube-api-access-fswp7\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.762957 master-0 kubenswrapper[7508]: I0313 10:37:20.762897 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbsc\" (UniqueName: \"kubernetes.io/projected/61427254-6722-4d1a-a96a-dadd24abbe94-kube-api-access-6vbsc\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:37:20.791154 master-0 kubenswrapper[7508]: I0313 10:37:20.790923 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:37:20.966128 master-0 kubenswrapper[7508]: I0313 10:37:20.966042 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp"]
Mar 13 10:37:20.971988 master-0 kubenswrapper[7508]: I0313 10:37:20.968532 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp"
Mar 13 10:37:20.971988 master-0 kubenswrapper[7508]: I0313 10:37:20.971930 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 10:37:20.972244 master-0 kubenswrapper[7508]: I0313 10:37:20.972222 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 13 10:37:20.972344 master-0 kubenswrapper[7508]: I0313 10:37:20.972308 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 13 10:37:20.972396 master-0 kubenswrapper[7508]: I0313 10:37:20.972308 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dvqsb"
Mar 13 10:37:21.022222 master-0 kubenswrapper[7508]: I0313 10:37:21.021334 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp"]
Mar 13 10:37:21.031135 master-0 kubenswrapper[7508]: I0313 10:37:21.031084 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images\") pod \"machine-api-operator-84bf6db4f9-svqcp\"
(UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.031324 master-0 kubenswrapper[7508]: I0313 10:37:21.031154 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrmcp\" (UniqueName: \"kubernetes.io/projected/a0917212-59d8-4799-a9bc-52e358c5e8a0-kube-api-access-lrmcp\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.031324 master-0 kubenswrapper[7508]: I0313 10:37:21.031192 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.031324 master-0 kubenswrapper[7508]: I0313 10:37:21.031213 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.064426 master-0 kubenswrapper[7508]: I0313 10:37:21.064364 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:37:21.127602 master-0 kubenswrapper[7508]: I0313 10:37:21.127263 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"] Mar 13 10:37:21.128579 master-0 kubenswrapper[7508]: I0313 10:37:21.128538 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.134926 master-0 kubenswrapper[7508]: I0313 10:37:21.132781 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.134926 master-0 kubenswrapper[7508]: I0313 10:37:21.132878 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmcp\" (UniqueName: \"kubernetes.io/projected/a0917212-59d8-4799-a9bc-52e358c5e8a0-kube-api-access-lrmcp\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.134926 master-0 kubenswrapper[7508]: I0313 10:37:21.132920 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.134926 master-0 kubenswrapper[7508]: I0313 10:37:21.132950 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.137267 master-0 kubenswrapper[7508]: I0313 10:37:21.137208 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.137267 master-0 kubenswrapper[7508]: I0313 10:37:21.137208 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.138381 master-0 kubenswrapper[7508]: I0313 10:37:21.137845 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 10:37:21.139233 master-0 kubenswrapper[7508]: I0313 10:37:21.139206 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.299370 master-0 kubenswrapper[7508]: I0313 10:37:21.297054 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97328e01-1227-417e-9af7-6426495d96db-tmpfs\") 
pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.299370 master-0 kubenswrapper[7508]: I0313 10:37:21.297165 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffmmr\" (UniqueName: \"kubernetes.io/projected/97328e01-1227-417e-9af7-6426495d96db-kube-api-access-ffmmr\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.299370 master-0 kubenswrapper[7508]: I0313 10:37:21.297210 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.299370 master-0 kubenswrapper[7508]: I0313 10:37:21.297265 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.301733 master-0 kubenswrapper[7508]: I0313 10:37:21.300571 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"] Mar 13 10:37:21.341160 master-0 kubenswrapper[7508]: I0313 10:37:21.341070 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmcp\" (UniqueName: 
\"kubernetes.io/projected/a0917212-59d8-4799-a9bc-52e358c5e8a0-kube-api-access-lrmcp\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:21.398869 master-0 kubenswrapper[7508]: I0313 10:37:21.398790 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmmr\" (UniqueName: \"kubernetes.io/projected/97328e01-1227-417e-9af7-6426495d96db-kube-api-access-ffmmr\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.399069 master-0 kubenswrapper[7508]: I0313 10:37:21.398898 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.399069 master-0 kubenswrapper[7508]: I0313 10:37:21.398952 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.399199 master-0 kubenswrapper[7508]: I0313 10:37:21.399075 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97328e01-1227-417e-9af7-6426495d96db-tmpfs\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.400254 master-0 
kubenswrapper[7508]: I0313 10:37:21.400224 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97328e01-1227-417e-9af7-6426495d96db-tmpfs\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.412867 master-0 kubenswrapper[7508]: I0313 10:37:21.412818 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.426282 master-0 kubenswrapper[7508]: I0313 10:37:21.426134 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.437192 master-0 kubenswrapper[7508]: I0313 10:37:21.436562 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmmr\" (UniqueName: \"kubernetes.io/projected/97328e01-1227-417e-9af7-6426495d96db-kube-api-access-ffmmr\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.599311 master-0 kubenswrapper[7508]: I0313 10:37:21.556553 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:21.629109 master-0 kubenswrapper[7508]: I0313 10:37:21.627987 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:37:22.445550 master-0 kubenswrapper[7508]: I0313 10:37:22.445492 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbhll"] Mar 13 10:37:22.462350 master-0 kubenswrapper[7508]: I0313 10:37:22.462169 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbmz8"] Mar 13 10:37:22.474265 master-0 kubenswrapper[7508]: W0313 10:37:22.474048 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0881de70_2db3_4fc2_b976_b55c11dc239d.slice/crio-51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b WatchSource:0}: Error finding container 51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b: Status 404 returned error can't find the container with id 51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b Mar 13 10:37:22.570157 master-0 kubenswrapper[7508]: I0313 10:37:22.569861 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-lhqzl"] Mar 13 10:37:22.573074 master-0 kubenswrapper[7508]: I0313 10:37:22.573041 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.575181 master-0 kubenswrapper[7508]: I0313 10:37:22.575159 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-fsw7z" Mar 13 10:37:22.582414 master-0 kubenswrapper[7508]: I0313 10:37:22.582375 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhqzl"] Mar 13 10:37:22.765872 master-0 kubenswrapper[7508]: I0313 10:37:22.765733 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhzw"] Mar 13 10:37:22.767002 master-0 kubenswrapper[7508]: I0313 10:37:22.766969 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:22.768761 master-0 kubenswrapper[7508]: I0313 10:37:22.768738 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x4n7x" Mar 13 10:37:22.789358 master-0 kubenswrapper[7508]: I0313 10:37:22.788783 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhzw"] Mar 13 10:37:22.831488 master-0 kubenswrapper[7508]: I0313 10:37:22.831425 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-utilities\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.831488 master-0 kubenswrapper[7508]: I0313 10:37:22.831494 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4kn26\" (UniqueName: \"kubernetes.io/projected/8b07c5ae-1149-4031-bd92-6df4331e586c-kube-api-access-4kn26\") pod \"community-operators-lhqzl\" (UID: 
\"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.831788 master-0 kubenswrapper[7508]: I0313 10:37:22.831565 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-catalog-content\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.933284 master-0 kubenswrapper[7508]: I0313 10:37:22.933039 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-catalog-content\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:22.933284 master-0 kubenswrapper[7508]: I0313 10:37:22.933124 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-utilities\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.933284 master-0 kubenswrapper[7508]: I0313 10:37:22.933148 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kn26\" (UniqueName: \"kubernetes.io/projected/8b07c5ae-1149-4031-bd92-6df4331e586c-kube-api-access-4kn26\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.933284 master-0 kubenswrapper[7508]: I0313 10:37:22.933175 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bn7vq\" (UniqueName: 
\"kubernetes.io/projected/f99b999c-4213-4d29-ab14-26c584e88445-kube-api-access-bn7vq\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:22.933284 master-0 kubenswrapper[7508]: I0313 10:37:22.933206 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-utilities\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:22.933284 master-0 kubenswrapper[7508]: I0313 10:37:22.933238 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-catalog-content\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.934459 master-0 kubenswrapper[7508]: I0313 10:37:22.933739 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-catalog-content\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.934459 master-0 kubenswrapper[7508]: I0313 10:37:22.934000 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-utilities\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:22.965217 master-0 kubenswrapper[7508]: I0313 10:37:22.954227 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-4kn26\" (UniqueName: \"kubernetes.io/projected/8b07c5ae-1149-4031-bd92-6df4331e586c-kube-api-access-4kn26\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:23.034809 master-0 kubenswrapper[7508]: I0313 10:37:23.034677 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7vq\" (UniqueName: \"kubernetes.io/projected/f99b999c-4213-4d29-ab14-26c584e88445-kube-api-access-bn7vq\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.034809 master-0 kubenswrapper[7508]: I0313 10:37:23.034737 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-utilities\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.034809 master-0 kubenswrapper[7508]: I0313 10:37:23.034794 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-catalog-content\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.036756 master-0 kubenswrapper[7508]: I0313 10:37:23.035273 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-catalog-content\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.036756 master-0 kubenswrapper[7508]: I0313 10:37:23.035789 7508 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-utilities\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.038943 master-0 kubenswrapper[7508]: I0313 10:37:23.038846 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:37:23.055122 master-0 kubenswrapper[7508]: I0313 10:37:23.055035 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7vq\" (UniqueName: \"kubernetes.io/projected/f99b999c-4213-4d29-ab14-26c584e88445-kube-api-access-bn7vq\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.087064 master-0 kubenswrapper[7508]: I0313 10:37:23.085875 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:37:23.424086 master-0 kubenswrapper[7508]: I0313 10:37:23.424023 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b"} Mar 13 10:37:23.428005 master-0 kubenswrapper[7508]: I0313 10:37:23.427973 7508 generic.go:334] "Generic (PLEG): container finished" podID="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" containerID="2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba" exitCode=0 Mar 13 10:37:23.428005 master-0 kubenswrapper[7508]: I0313 10:37:23.428006 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerDied","Data":"2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba"} Mar 13 10:37:23.428581 master-0 kubenswrapper[7508]: I0313 10:37:23.428519 7508 scope.go:117] "RemoveContainer" containerID="2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba" Mar 13 10:37:24.561719 master-0 kubenswrapper[7508]: I0313 10:37:24.560816 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:37:25.444187 master-0 kubenswrapper[7508]: I0313 10:37:25.444150 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d04e4749-2b79-49e2-a451-a2733443a913/installer/0.log" Mar 13 10:37:25.444454 master-0 kubenswrapper[7508]: I0313 10:37:25.444413 7508 generic.go:334] "Generic (PLEG): container finished" podID="d04e4749-2b79-49e2-a451-a2733443a913" containerID="6bd307155c0397e849a532ef6dcebc4cbbbf850ed4d002b219c4c046ec36c6b8" exitCode=1 
Mar 13 10:37:25.444610 master-0 kubenswrapper[7508]: I0313 10:37:25.444589 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d04e4749-2b79-49e2-a451-a2733443a913","Type":"ContainerDied","Data":"6bd307155c0397e849a532ef6dcebc4cbbbf850ed4d002b219c4c046ec36c6b8"} Mar 13 10:37:25.444715 master-0 kubenswrapper[7508]: I0313 10:37:25.444700 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"d04e4749-2b79-49e2-a451-a2733443a913","Type":"ContainerDied","Data":"f1b14a227c8bc8b981f29cfb4546b0b823b1f503f3e6f7c9e6a036205e1e83ce"} Mar 13 10:37:25.444805 master-0 kubenswrapper[7508]: I0313 10:37:25.444788 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b14a227c8bc8b981f29cfb4546b0b823b1f503f3e6f7c9e6a036205e1e83ce" Mar 13 10:37:25.445704 master-0 kubenswrapper[7508]: I0313 10:37:25.445688 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerStarted","Data":"b399c8bc734d16f4c258d0605a39203e9489484fa48d09e79fa8aa138647119c"} Mar 13 10:37:25.689325 master-0 kubenswrapper[7508]: I0313 10:37:25.688975 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_d04e4749-2b79-49e2-a451-a2733443a913/installer/0.log" Mar 13 10:37:25.689325 master-0 kubenswrapper[7508]: I0313 10:37:25.689054 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:37:26.105129 master-0 kubenswrapper[7508]: I0313 10:37:26.096380 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04e4749-2b79-49e2-a451-a2733443a913-kube-api-access\") pod \"d04e4749-2b79-49e2-a451-a2733443a913\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " Mar 13 10:37:26.105129 master-0 kubenswrapper[7508]: I0313 10:37:26.096423 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-kubelet-dir\") pod \"d04e4749-2b79-49e2-a451-a2733443a913\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " Mar 13 10:37:26.105129 master-0 kubenswrapper[7508]: I0313 10:37:26.096479 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-var-lock\") pod \"d04e4749-2b79-49e2-a451-a2733443a913\" (UID: \"d04e4749-2b79-49e2-a451-a2733443a913\") " Mar 13 10:37:26.105129 master-0 kubenswrapper[7508]: I0313 10:37:26.096838 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-var-lock" (OuterVolumeSpecName: "var-lock") pod "d04e4749-2b79-49e2-a451-a2733443a913" (UID: "d04e4749-2b79-49e2-a451-a2733443a913"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:26.105129 master-0 kubenswrapper[7508]: I0313 10:37:26.098252 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "d04e4749-2b79-49e2-a451-a2733443a913" (UID: "d04e4749-2b79-49e2-a451-a2733443a913"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:26.113678 master-0 kubenswrapper[7508]: I0313 10:37:26.111877 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d04e4749-2b79-49e2-a451-a2733443a913-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "d04e4749-2b79-49e2-a451-a2733443a913" (UID: "d04e4749-2b79-49e2-a451-a2733443a913"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:26.197620 master-0 kubenswrapper[7508]: I0313 10:37:26.197560 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/d04e4749-2b79-49e2-a451-a2733443a913-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:26.197620 master-0 kubenswrapper[7508]: I0313 10:37:26.197621 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:26.197620 master-0 kubenswrapper[7508]: I0313 10:37:26.197633 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/d04e4749-2b79-49e2-a451-a2733443a913-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:26.469171 master-0 kubenswrapper[7508]: I0313 10:37:26.464813 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"] Mar 13 10:37:26.469934 master-0 kubenswrapper[7508]: W0313 10:37:26.469879 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61427254_6722_4d1a_a96a_dadd24abbe94.slice/crio-19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d WatchSource:0}: Error finding container 19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d: Status 
404 returned error can't find the container with id 19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d Mar 13 10:37:26.472642 master-0 kubenswrapper[7508]: I0313 10:37:26.472608 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerStarted","Data":"a37231e5cc55c1e76147d48ef0838775a990a3f28298bc163b9c8540136b0b87"} Mar 13 10:37:26.487998 master-0 kubenswrapper[7508]: I0313 10:37:26.487856 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerStarted","Data":"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"} Mar 13 10:37:26.492541 master-0 kubenswrapper[7508]: I0313 10:37:26.492478 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerStarted","Data":"77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490"} Mar 13 10:37:26.496147 master-0 kubenswrapper[7508]: I0313 10:37:26.496085 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerStarted","Data":"e541c073a97e968aa996efa485f9023f303d33477bd12a38bf45fb29e057d0dc"} Mar 13 10:37:26.496254 master-0 kubenswrapper[7508]: I0313 10:37:26.496195 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 13 10:37:27.182744 master-0 kubenswrapper[7508]: I0313 10:37:27.182146 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-kqrsd"] Mar 13 10:37:27.184643 master-0 kubenswrapper[7508]: I0313 10:37:27.184028 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2"] Mar 13 10:37:27.507287 master-0 kubenswrapper[7508]: I0313 10:37:27.507226 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerStarted","Data":"5759216ebfee850b79609783445de8124c370c8bac5b63e2b5f03e38c742e1f0"} Mar 13 10:37:27.507287 master-0 kubenswrapper[7508]: I0313 10:37:27.507279 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"d11003d934637dd1f9b6e8d23feaca1fc18325edb8c1c59e1375d0720a4469cd"} Mar 13 10:37:27.507287 master-0 kubenswrapper[7508]: I0313 10:37:27.507296 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d"} Mar 13 10:37:27.507704 master-0 kubenswrapper[7508]: I0313 10:37:27.507310 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerStarted","Data":"e95e82ba3152944d5f266f4315ecef6f288f0249fcf6dd92d242f6cd35eb008a"} Mar 13 10:37:28.513891 master-0 kubenswrapper[7508]: I0313 10:37:28.513826 7508 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerStarted","Data":"9af6352032b6a53c8275f34292597e82151238d6d1e06b053ba0617d04ed63ea"} Mar 13 10:37:28.516297 master-0 kubenswrapper[7508]: I0313 10:37:28.516238 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"3063852d78f813c61c60f480671955bc61c573d347b6da50459bfe7f96b2e4ca"} Mar 13 10:37:28.616973 master-0 kubenswrapper[7508]: I0313 10:37:28.616870 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-dnhzw"] Mar 13 10:37:28.631477 master-0 kubenswrapper[7508]: I0313 10:37:28.631444 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 13 10:37:28.636966 master-0 kubenswrapper[7508]: I0313 10:37:28.634214 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp"] Mar 13 10:37:28.636966 master-0 kubenswrapper[7508]: I0313 10:37:28.636689 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-v9x5b"] Mar 13 10:37:28.641045 master-0 kubenswrapper[7508]: I0313 10:37:28.640988 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-lhqzl"] Mar 13 10:37:28.643936 master-0 kubenswrapper[7508]: I0313 10:37:28.643858 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"] Mar 13 10:37:28.646924 master-0 kubenswrapper[7508]: I0313 10:37:28.646875 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"] Mar 13 10:37:28.653566 
master-0 kubenswrapper[7508]: I0313 10:37:28.653523 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-kwwkz"] Mar 13 10:37:28.655814 master-0 kubenswrapper[7508]: I0313 10:37:28.655788 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt"] Mar 13 10:37:29.143773 master-0 kubenswrapper[7508]: W0313 10:37:29.143673 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf99b999c_4213_4d29_ab14_26c584e88445.slice/crio-2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2 WatchSource:0}: Error finding container 2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2: Status 404 returned error can't find the container with id 2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2 Mar 13 10:37:29.523650 master-0 kubenswrapper[7508]: I0313 10:37:29.523434 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerStarted","Data":"4f70e184622d577e74124d1d17bc445ea80514437cbc221bcb9f2c6f012aa2ca"} Mar 13 10:37:29.527705 master-0 kubenswrapper[7508]: I0313 10:37:29.527635 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerStarted","Data":"e3e74e8a6d87769b2b8f6bdae5a948fbb44f464be31e39d10a8d9e290f6b63c1"} Mar 13 10:37:29.530725 master-0 kubenswrapper[7508]: I0313 10:37:29.530620 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerStarted","Data":"2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2"} Mar 13 10:37:30.541866 master-0 
kubenswrapper[7508]: I0313 10:37:30.541798 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" event={"ID":"2563ecb2-5783-4c45-a7f6-180e14e1c8c4","Type":"ContainerStarted","Data":"b6406db9242e3599a9f6b43c6cc7f931a2398c12649757d5a331d9757d32028e"} Mar 13 10:37:31.662150 master-0 kubenswrapper[7508]: I0313 10:37:31.661305 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" podStartSLOduration=9.466524988 podStartE2EDuration="25.661256129s" podCreationTimestamp="2026-03-13 10:37:06 +0000 UTC" firstStartedPulling="2026-03-13 10:37:08.409725737 +0000 UTC m=+67.152550854" lastFinishedPulling="2026-03-13 10:37:24.604456878 +0000 UTC m=+83.347281995" observedRunningTime="2026-03-13 10:37:31.658077606 +0000 UTC m=+90.400902713" watchObservedRunningTime="2026-03-13 10:37:31.661256129 +0000 UTC m=+90.404081256" Mar 13 10:37:32.620239 master-0 kubenswrapper[7508]: I0313 10:37:32.619144 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" podStartSLOduration=12.099091825 podStartE2EDuration="22.619114051s" podCreationTimestamp="2026-03-13 10:37:10 +0000 UTC" firstStartedPulling="2026-03-13 10:37:13.980963383 +0000 UTC m=+72.723788490" lastFinishedPulling="2026-03-13 10:37:24.500985599 +0000 UTC m=+83.243810716" observedRunningTime="2026-03-13 10:37:32.580272708 +0000 UTC m=+91.323097835" watchObservedRunningTime="2026-03-13 10:37:32.619114051 +0000 UTC m=+91.361939178" Mar 13 10:37:32.620239 master-0 kubenswrapper[7508]: I0313 10:37:32.619829 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 10:37:32.638348 master-0 kubenswrapper[7508]: I0313 10:37:32.637726 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.732981 7508 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733068 7508 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: E0313 10:37:32.733448 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733476 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: E0313 10:37:32.733499 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04e4749-2b79-49e2-a451-a2733443a913" containerName="installer" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733505 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04e4749-2b79-49e2-a451-a2733443a913" containerName="installer" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: E0313 10:37:32.733519 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733525 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733662 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733679 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04e4749-2b79-49e2-a451-a2733443a913" 
containerName="installer" Mar 13 10:37:32.734119 master-0 kubenswrapper[7508]: I0313 10:37:32.733686 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 13 10:37:32.738134 master-0 kubenswrapper[7508]: I0313 10:37:32.735512 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://fc59a335ab92b5426116aa2f5adb31266760392f014df421d723f95bb6f6ebfb" gracePeriod=30 Mar 13 10:37:32.738134 master-0 kubenswrapper[7508]: I0313 10:37:32.735639 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://2b53706ef774eb15c126f57be58e4c0c9f005142fd0e9af295b43871ae8de7ef" gracePeriod=30 Mar 13 10:37:32.738134 master-0 kubenswrapper[7508]: I0313 10:37:32.737628 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.814402 master-0 kubenswrapper[7508]: I0313 10:37:32.812131 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.814402 master-0 kubenswrapper[7508]: I0313 10:37:32.812205 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.814402 master-0 kubenswrapper[7508]: I0313 10:37:32.812342 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.814402 master-0 kubenswrapper[7508]: I0313 10:37:32.812413 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.814402 master-0 kubenswrapper[7508]: I0313 10:37:32.812481 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.814402 master-0 
kubenswrapper[7508]: I0313 10:37:32.812789 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914381 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914450 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914484 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914520 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914555 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914584 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914695 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914763 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914800 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914828 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 
10:37:32.914860 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:32.919340 master-0 kubenswrapper[7508]: I0313 10:37:32.914902 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:37:33.297556 master-0 kubenswrapper[7508]: W0313 10:37:33.296911 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod257ae542_4a06_42d3_b3e8_bf0a376494a8.slice/crio-8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f WatchSource:0}: Error finding container 8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f: Status 404 returned error can't find the container with id 8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f Mar 13 10:37:33.297556 master-0 kubenswrapper[7508]: W0313 10:37:33.297194 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0917212_59d8_4799_a9bc_52e358c5e8a0.slice/crio-3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858 WatchSource:0}: Error finding container 3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858: Status 404 returned error can't find the container with id 3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858 Mar 13 10:37:33.304777 master-0 kubenswrapper[7508]: W0313 10:37:33.301639 7508 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-podc834b554_c652_4f45_9110_3d4e260ba98a.slice/crio-14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495 WatchSource:0}: Error finding container 14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495: Status 404 returned error can't find the container with id 14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495 Mar 13 10:37:33.513658 master-0 kubenswrapper[7508]: I0313 10:37:33.513563 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d04e4749-2b79-49e2-a451-a2733443a913" path="/var/lib/kubelet/pods/d04e4749-2b79-49e2-a451-a2733443a913/volumes" Mar 13 10:37:33.645922 master-0 kubenswrapper[7508]: I0313 10:37:33.645845 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerStarted","Data":"edb84f3680f6b7a9122dea49c8ac75c4b3614e7e24eb119b118fbf82de0d5e2c"} Mar 13 10:37:33.649007 master-0 kubenswrapper[7508]: I0313 10:37:33.648935 7508 generic.go:334] "Generic (PLEG): container finished" podID="2157cb66-d458-4353-bc9c-ef761e61e5c5" containerID="9af6352032b6a53c8275f34292597e82151238d6d1e06b053ba0617d04ed63ea" exitCode=0 Mar 13 10:37:33.649164 master-0 kubenswrapper[7508]: I0313 10:37:33.649007 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerDied","Data":"9af6352032b6a53c8275f34292597e82151238d6d1e06b053ba0617d04ed63ea"} Mar 13 10:37:33.650973 master-0 kubenswrapper[7508]: I0313 10:37:33.650932 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c834b554-c652-4f45-9110-3d4e260ba98a","Type":"ContainerStarted","Data":"14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495"} Mar 13 10:37:33.652617 master-0 kubenswrapper[7508]: I0313 10:37:33.652575 7508 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerStarted","Data":"8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f"} Mar 13 10:37:33.654659 master-0 kubenswrapper[7508]: I0313 10:37:33.654628 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858"} Mar 13 10:37:33.656597 master-0 kubenswrapper[7508]: I0313 10:37:33.656357 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"9fb60bfa59d2ff40288f456815269ff4c838e82195edd334933c8654b4f8dedd"} Mar 13 10:37:33.658595 master-0 kubenswrapper[7508]: I0313 10:37:33.658555 7508 generic.go:334] "Generic (PLEG): container finished" podID="8d2fdba3-9478-4165-9207-d01483625607" containerID="c06a4f7f54577d80872f3a5157b329f2c2ec17e43e599b09564a82e127162989" exitCode=0 Mar 13 10:37:33.658718 master-0 kubenswrapper[7508]: I0313 10:37:33.658602 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerDied","Data":"c06a4f7f54577d80872f3a5157b329f2c2ec17e43e599b09564a82e127162989"} Mar 13 10:37:33.659559 master-0 kubenswrapper[7508]: I0313 10:37:33.659513 7508 scope.go:117] "RemoveContainer" containerID="c06a4f7f54577d80872f3a5157b329f2c2ec17e43e599b09564a82e127162989" Mar 13 10:37:36.680271 master-0 kubenswrapper[7508]: I0313 10:37:36.680175 7508 generic.go:334] "Generic (PLEG): container finished" podID="994d29a3-98d8-45bd-8922-adcdc899b632" 
containerID="ccc3b2c6e99cb63369120234f78e03c40f7502629397be2489760d94a1bdc974" exitCode=0 Mar 13 10:37:36.680271 master-0 kubenswrapper[7508]: I0313 10:37:36.680230 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"994d29a3-98d8-45bd-8922-adcdc899b632","Type":"ContainerDied","Data":"ccc3b2c6e99cb63369120234f78e03c40f7502629397be2489760d94a1bdc974"} Mar 13 10:37:45.804893 master-0 kubenswrapper[7508]: E0313 10:37:45.804829 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 10:37:45.805575 master-0 kubenswrapper[7508]: I0313 10:37:45.805419 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 10:37:45.860745 master-0 kubenswrapper[7508]: E0313 10:37:45.860669 7508 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:37:48.845422 master-0 kubenswrapper[7508]: I0313 10:37:48.845179 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 13 10:37:53.331801 master-0 kubenswrapper[7508]: I0313 10:37:53.331696 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 10:37:53.436475 master-0 kubenswrapper[7508]: I0313 10:37:53.436350 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/994d29a3-98d8-45bd-8922-adcdc899b632-kube-api-access\") pod \"994d29a3-98d8-45bd-8922-adcdc899b632\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " Mar 13 10:37:53.436475 master-0 kubenswrapper[7508]: I0313 10:37:53.436498 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-kubelet-dir\") pod \"994d29a3-98d8-45bd-8922-adcdc899b632\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " Mar 13 10:37:53.436936 master-0 kubenswrapper[7508]: I0313 10:37:53.436573 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-var-lock\") pod \"994d29a3-98d8-45bd-8922-adcdc899b632\" (UID: \"994d29a3-98d8-45bd-8922-adcdc899b632\") " Mar 13 10:37:53.436936 master-0 kubenswrapper[7508]: I0313 10:37:53.436703 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "994d29a3-98d8-45bd-8922-adcdc899b632" (UID: "994d29a3-98d8-45bd-8922-adcdc899b632"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:53.436936 master-0 kubenswrapper[7508]: I0313 10:37:53.436730 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-var-lock" (OuterVolumeSpecName: "var-lock") pod "994d29a3-98d8-45bd-8922-adcdc899b632" (UID: "994d29a3-98d8-45bd-8922-adcdc899b632"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:53.437433 master-0 kubenswrapper[7508]: I0313 10:37:53.437374 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:53.437433 master-0 kubenswrapper[7508]: I0313 10:37:53.437423 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/994d29a3-98d8-45bd-8922-adcdc899b632-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:53.445051 master-0 kubenswrapper[7508]: I0313 10:37:53.444982 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/994d29a3-98d8-45bd-8922-adcdc899b632-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "994d29a3-98d8-45bd-8922-adcdc899b632" (UID: "994d29a3-98d8-45bd-8922-adcdc899b632"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:53.539052 master-0 kubenswrapper[7508]: I0313 10:37:53.538980 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/994d29a3-98d8-45bd-8922-adcdc899b632-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:53.649137 master-0 kubenswrapper[7508]: I0313 10:37:53.648979 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 13 10:37:53.787784 master-0 kubenswrapper[7508]: I0313 10:37:53.787672 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"994d29a3-98d8-45bd-8922-adcdc899b632","Type":"ContainerDied","Data":"86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8"} Mar 13 10:37:53.787784 master-0 kubenswrapper[7508]: I0313 10:37:53.787713 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 10:37:53.787784 master-0 kubenswrapper[7508]: I0313 10:37:53.787748 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8" Mar 13 10:37:53.790684 master-0 kubenswrapper[7508]: I0313 10:37:53.790613 7508 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="476341e9a176df7914ed42068e9cb3e621e16d05240f26c7f1a1bd7339384984" exitCode=1 Mar 13 10:37:53.790769 master-0 kubenswrapper[7508]: I0313 10:37:53.790686 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"476341e9a176df7914ed42068e9cb3e621e16d05240f26c7f1a1bd7339384984"} Mar 13 10:37:53.790918 master-0 kubenswrapper[7508]: I0313 10:37:53.790787 7508 scope.go:117] "RemoveContainer" containerID="6babec6a5c3649a6bea3ec1be171dc4161391ea03cf72605db3e897bf23d8b34" Mar 13 10:37:53.791273 master-0 kubenswrapper[7508]: I0313 10:37:53.791241 7508 scope.go:117] "RemoveContainer" containerID="476341e9a176df7914ed42068e9cb3e621e16d05240f26c7f1a1bd7339384984" Mar 13 10:37:53.794900 master-0 kubenswrapper[7508]: I0313 10:37:53.794653 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_046ee36d-4062-4c48-bab0-57381613b2ad/installer/0.log" Mar 13 10:37:53.794900 master-0 kubenswrapper[7508]: I0313 10:37:53.794703 7508 generic.go:334] "Generic (PLEG): container finished" podID="046ee36d-4062-4c48-bab0-57381613b2ad" containerID="a4f70fa035c3abd6f5af326fa3fcfdfa3c4b57e2f6aae8e90bf89bf8fa6d8b52" exitCode=1 Mar 13 10:37:53.794900 master-0 kubenswrapper[7508]: I0313 10:37:53.794736 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" 
event={"ID":"046ee36d-4062-4c48-bab0-57381613b2ad","Type":"ContainerDied","Data":"a4f70fa035c3abd6f5af326fa3fcfdfa3c4b57e2f6aae8e90bf89bf8fa6d8b52"} Mar 13 10:37:54.801525 master-0 kubenswrapper[7508]: I0313 10:37:54.801474 7508 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="c31330ee13d04180dffe9b5d1e1dc3fa90364bd389b7bdc31c0456dc4709e569" exitCode=1 Mar 13 10:37:54.802135 master-0 kubenswrapper[7508]: I0313 10:37:54.801677 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"c31330ee13d04180dffe9b5d1e1dc3fa90364bd389b7bdc31c0456dc4709e569"} Mar 13 10:37:54.802706 master-0 kubenswrapper[7508]: I0313 10:37:54.802692 7508 scope.go:117] "RemoveContainer" containerID="c31330ee13d04180dffe9b5d1e1dc3fa90364bd389b7bdc31c0456dc4709e569" Mar 13 10:37:55.816361 master-0 kubenswrapper[7508]: I0313 10:37:55.816228 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_046ee36d-4062-4c48-bab0-57381613b2ad/installer/0.log" Mar 13 10:37:55.816361 master-0 kubenswrapper[7508]: I0313 10:37:55.816308 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:55.817082 master-0 kubenswrapper[7508]: I0313 10:37:55.817017 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"6c3bc64f22f8c58f9e978db84c7754f9ee2b132931d3190f29d081554cf105af"} Mar 13 10:37:55.821687 master-0 kubenswrapper[7508]: I0313 10:37:55.821627 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_046ee36d-4062-4c48-bab0-57381613b2ad/installer/0.log" Mar 13 10:37:55.821687 master-0 kubenswrapper[7508]: I0313 10:37:55.821672 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"046ee36d-4062-4c48-bab0-57381613b2ad","Type":"ContainerDied","Data":"d95bfc7a6e7ed39f46a268281c89d6bfb2a11e840e00f0153aa1d414147f5319"} Mar 13 10:37:55.821865 master-0 kubenswrapper[7508]: I0313 10:37:55.821697 7508 scope.go:117] "RemoveContainer" containerID="a4f70fa035c3abd6f5af326fa3fcfdfa3c4b57e2f6aae8e90bf89bf8fa6d8b52" Mar 13 10:37:55.821865 master-0 kubenswrapper[7508]: I0313 10:37:55.821768 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 13 10:37:55.862294 master-0 kubenswrapper[7508]: E0313 10:37:55.862214 7508 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:37:55.980479 master-0 kubenswrapper[7508]: I0313 10:37:55.980425 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-var-lock\") pod \"046ee36d-4062-4c48-bab0-57381613b2ad\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " Mar 13 10:37:55.980590 master-0 kubenswrapper[7508]: I0313 10:37:55.980505 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-kubelet-dir\") pod \"046ee36d-4062-4c48-bab0-57381613b2ad\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " Mar 13 10:37:55.980590 master-0 kubenswrapper[7508]: I0313 10:37:55.980585 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/046ee36d-4062-4c48-bab0-57381613b2ad-kube-api-access\") pod \"046ee36d-4062-4c48-bab0-57381613b2ad\" (UID: \"046ee36d-4062-4c48-bab0-57381613b2ad\") " Mar 13 10:37:55.981149 master-0 kubenswrapper[7508]: I0313 10:37:55.981054 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "046ee36d-4062-4c48-bab0-57381613b2ad" (UID: "046ee36d-4062-4c48-bab0-57381613b2ad"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:55.982988 master-0 kubenswrapper[7508]: I0313 10:37:55.982953 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-var-lock" (OuterVolumeSpecName: "var-lock") pod "046ee36d-4062-4c48-bab0-57381613b2ad" (UID: "046ee36d-4062-4c48-bab0-57381613b2ad"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:55.987613 master-0 kubenswrapper[7508]: I0313 10:37:55.987423 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/046ee36d-4062-4c48-bab0-57381613b2ad-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "046ee36d-4062-4c48-bab0-57381613b2ad" (UID: "046ee36d-4062-4c48-bab0-57381613b2ad"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:56.094638 master-0 kubenswrapper[7508]: I0313 10:37:56.093975 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/046ee36d-4062-4c48-bab0-57381613b2ad-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:56.094638 master-0 kubenswrapper[7508]: I0313 10:37:56.094023 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:56.094638 master-0 kubenswrapper[7508]: I0313 10:37:56.094040 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/046ee36d-4062-4c48-bab0-57381613b2ad-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:56.845428 master-0 kubenswrapper[7508]: I0313 10:37:56.845270 7508 generic.go:334] "Generic (PLEG): container finished" podID="257ae542-4a06-42d3-b3e8-bf0a376494a8" 
containerID="18b792e5b93f77cf52a60082b53bff347c1fb4352f7afe19baba67d3e0c88848" exitCode=0 Mar 13 10:37:56.846427 master-0 kubenswrapper[7508]: I0313 10:37:56.845523 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerDied","Data":"18b792e5b93f77cf52a60082b53bff347c1fb4352f7afe19baba67d3e0c88848"} Mar 13 10:37:56.847020 master-0 kubenswrapper[7508]: I0313 10:37:56.846998 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_6b488263-6a56-439c-945e-926936ed049d/installer/0.log" Mar 13 10:37:56.847072 master-0 kubenswrapper[7508]: I0313 10:37:56.847027 7508 generic.go:334] "Generic (PLEG): container finished" podID="6b488263-6a56-439c-945e-926936ed049d" containerID="cfc30e3ed734f4cb74033d3d0ab50e918052fd74c62e5f4931d21fcdfbcbd074" exitCode=1 Mar 13 10:37:56.847072 master-0 kubenswrapper[7508]: I0313 10:37:56.847065 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"6b488263-6a56-439c-945e-926936ed049d","Type":"ContainerDied","Data":"cfc30e3ed734f4cb74033d3d0ab50e918052fd74c62e5f4931d21fcdfbcbd074"} Mar 13 10:37:56.849500 master-0 kubenswrapper[7508]: I0313 10:37:56.849461 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerStarted","Data":"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309"} Mar 13 10:37:56.970726 master-0 kubenswrapper[7508]: I0313 10:37:56.970622 7508 generic.go:334] "Generic (PLEG): container finished" podID="f99b999c-4213-4d29-ab14-26c584e88445" containerID="b5d9c7e0055ba7e94e605d53781c97326170e75e394826099511e568c7ceef53" exitCode=0 Mar 13 10:37:56.970943 master-0 kubenswrapper[7508]: I0313 10:37:56.970737 7508 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerDied","Data":"b5d9c7e0055ba7e94e605d53781c97326170e75e394826099511e568c7ceef53"} Mar 13 10:37:56.978448 master-0 kubenswrapper[7508]: I0313 10:37:56.978052 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"820c0c015259910a43a9d65233b6b59d3ff531e30b8ae70477184cc755d8b5d2"} Mar 13 10:37:56.991558 master-0 kubenswrapper[7508]: I0313 10:37:56.991432 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"8f137541b8024be9dec3a0e2a3bb479dfd8210f470244154f734979cdb98e7ff"} Mar 13 10:37:56.996398 master-0 kubenswrapper[7508]: I0313 10:37:56.996346 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe"} Mar 13 10:37:57.000466 master-0 kubenswrapper[7508]: I0313 10:37:57.000432 7508 generic.go:334] "Generic (PLEG): container finished" podID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerID="9acadc1e500a9e42a3e74f238a47ec5a1293ae1ab61e880a662f2b2f6b011cdf" exitCode=0 Mar 13 10:37:57.000555 master-0 kubenswrapper[7508]: I0313 10:37:57.000491 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbmz8" event={"ID":"1b072636-e46b-47f6-af85-3210e62bbd2d","Type":"ContainerDied","Data":"9acadc1e500a9e42a3e74f238a47ec5a1293ae1ab61e880a662f2b2f6b011cdf"} Mar 13 10:37:57.004182 master-0 kubenswrapper[7508]: I0313 10:37:57.004150 7508 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258"} Mar 13 10:37:57.013157 master-0 kubenswrapper[7508]: I0313 10:37:57.012008 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c834b554-c652-4f45-9110-3d4e260ba98a","Type":"ContainerStarted","Data":"bda6d571a69475cffe984e819a7cc51ddb710348cfb7bd2636c19986e3e1d5ca"} Mar 13 10:37:57.018133 master-0 kubenswrapper[7508]: I0313 10:37:57.015151 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlpwf" event={"ID":"20df9416-90f4-4c21-a3bc-c6e5f6622e15","Type":"ContainerStarted","Data":"f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e"} Mar 13 10:37:57.018133 master-0 kubenswrapper[7508]: I0313 10:37:57.015404 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-wlpwf" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerName="extract-content" containerID="cri-o://f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e" gracePeriod=2 Mar 13 10:37:57.027415 master-0 kubenswrapper[7508]: I0313 10:37:57.024938 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" event={"ID":"2563ecb2-5783-4c45-a7f6-180e14e1c8c4","Type":"ContainerStarted","Data":"de680a22776cd5fe71b4b6d498091c7d353a1cf41ab4b46ddcaa37a48ad3bc06"} Mar 13 10:37:57.031113 master-0 kubenswrapper[7508]: I0313 10:37:57.027620 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"3671ca168b59df1b45e12ef956adf5651789dcef52410877c19c3c2f33c47060"} Mar 13 
10:37:57.044118 master-0 kubenswrapper[7508]: I0313 10:37:57.039190 7508 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f" exitCode=0 Mar 13 10:37:57.044118 master-0 kubenswrapper[7508]: I0313 10:37:57.039369 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f"} Mar 13 10:37:57.044118 master-0 kubenswrapper[7508]: I0313 10:37:57.042633 7508 generic.go:334] "Generic (PLEG): container finished" podID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" containerID="23014a10e46dd6d6965e2f7692313ca937c493926c4d143fea3625dfacd51f74" exitCode=0 Mar 13 10:37:57.044118 master-0 kubenswrapper[7508]: I0313 10:37:57.042712 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29dk6" event={"ID":"61b83fd7-2b78-42a9-9d93-0be3fd59a679","Type":"ContainerDied","Data":"23014a10e46dd6d6965e2f7692313ca937c493926c4d143fea3625dfacd51f74"} Mar 13 10:37:57.050281 master-0 kubenswrapper[7508]: I0313 10:37:57.049637 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerStarted","Data":"acffbeb48d69148ddc4c8917c5bd669fe4ed2976ba6b612592b2abc4fff01c7e"} Mar 13 10:37:57.056195 master-0 kubenswrapper[7508]: I0313 10:37:57.056142 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" event={"ID":"b02805e2-f186-4e59-bdfa-f4793263b468","Type":"ContainerStarted","Data":"706a8d0e60c2f5ca912ef3877380449fef368655b29cf505668eb09b4233133e"} Mar 13 10:37:57.062957 master-0 kubenswrapper[7508]: I0313 10:37:57.062804 7508 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerStarted","Data":"c39379a7ceff230ca12a3c25b2f95b4de4ef093f144e78b137c6626ee9d2fcfb"} Mar 13 10:37:57.063163 master-0 kubenswrapper[7508]: I0313 10:37:57.063120 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:37:57.067566 master-0 kubenswrapper[7508]: I0313 10:37:57.067499 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhll" event={"ID":"1b57fa2d-b65e-4c69-97ce-4a379470d2de","Type":"ContainerStarted","Data":"b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6"} Mar 13 10:37:57.072129 master-0 kubenswrapper[7508]: I0313 10:37:57.072079 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"4db43ea419f7842a2dbe6e4e76fd533d04eb0ced70cb2513c77273e29bfa971d"} Mar 13 10:37:57.072220 master-0 kubenswrapper[7508]: I0313 10:37:57.072136 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829"} Mar 13 10:37:57.081136 master-0 kubenswrapper[7508]: I0313 10:37:57.081058 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerStarted","Data":"6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea"} Mar 13 10:37:57.084417 master-0 kubenswrapper[7508]: I0313 10:37:57.084351 7508 generic.go:334] "Generic (PLEG): container finished" 
podID="8b07c5ae-1149-4031-bd92-6df4331e586c" containerID="053d1c527d639c6703a290ef72056a864dded275336f60631cf170ecafc6976b" exitCode=0 Mar 13 10:37:57.084481 master-0 kubenswrapper[7508]: I0313 10:37:57.084428 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerDied","Data":"053d1c527d639c6703a290ef72056a864dded275336f60631cf170ecafc6976b"} Mar 13 10:37:57.106454 master-0 kubenswrapper[7508]: I0313 10:37:57.106118 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerStarted","Data":"6b4220f271a2b153bc0e77946705d348742d71cd7644e3f17d99cbdeff70f16f"} Mar 13 10:37:57.330795 master-0 kubenswrapper[7508]: I0313 10:37:57.328534 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbmz8" Mar 13 10:37:57.460214 master-0 kubenswrapper[7508]: I0313 10:37:57.459576 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-catalog-content\") pod \"1b072636-e46b-47f6-af85-3210e62bbd2d\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " Mar 13 10:37:57.460214 master-0 kubenswrapper[7508]: I0313 10:37:57.459661 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gk52d\" (UniqueName: \"kubernetes.io/projected/1b072636-e46b-47f6-af85-3210e62bbd2d-kube-api-access-gk52d\") pod \"1b072636-e46b-47f6-af85-3210e62bbd2d\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " Mar 13 10:37:57.460214 master-0 kubenswrapper[7508]: I0313 10:37:57.459717 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-utilities\") pod \"1b072636-e46b-47f6-af85-3210e62bbd2d\" (UID: \"1b072636-e46b-47f6-af85-3210e62bbd2d\") " Mar 13 10:37:57.464660 master-0 kubenswrapper[7508]: I0313 10:37:57.464610 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b072636-e46b-47f6-af85-3210e62bbd2d-kube-api-access-gk52d" (OuterVolumeSpecName: "kube-api-access-gk52d") pod "1b072636-e46b-47f6-af85-3210e62bbd2d" (UID: "1b072636-e46b-47f6-af85-3210e62bbd2d"). InnerVolumeSpecName "kube-api-access-gk52d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:57.465329 master-0 kubenswrapper[7508]: I0313 10:37:57.465291 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-utilities" (OuterVolumeSpecName: "utilities") pod "1b072636-e46b-47f6-af85-3210e62bbd2d" (UID: "1b072636-e46b-47f6-af85-3210e62bbd2d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.478469 master-0 kubenswrapper[7508]: I0313 10:37:57.478347 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:37:57.488600 master-0 kubenswrapper[7508]: I0313 10:37:57.488207 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wlpwf_20df9416-90f4-4c21-a3bc-c6e5f6622e15/extract-content/0.log" Mar 13 10:37:57.489263 master-0 kubenswrapper[7508]: I0313 10:37:57.488919 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlpwf" Mar 13 10:37:57.491630 master-0 kubenswrapper[7508]: I0313 10:37:57.491559 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:57.503537 master-0 kubenswrapper[7508]: I0313 10:37:57.503023 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b072636-e46b-47f6-af85-3210e62bbd2d" (UID: "1b072636-e46b-47f6-af85-3210e62bbd2d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.565107 master-0 kubenswrapper[7508]: I0313 10:37:57.565031 7508 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.565289 master-0 kubenswrapper[7508]: I0313 10:37:57.565127 7508 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b072636-e46b-47f6-af85-3210e62bbd2d-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.565289 master-0 kubenswrapper[7508]: I0313 10:37:57.565145 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gk52d\" (UniqueName: \"kubernetes.io/projected/1b072636-e46b-47f6-af85-3210e62bbd2d-kube-api-access-gk52d\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.666657 master-0 kubenswrapper[7508]: I0313 10:37:57.666573 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-catalog-content\") pod \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " Mar 13 10:37:57.666759 master-0 kubenswrapper[7508]: I0313 10:37:57.666699 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hmx8s\" (UniqueName: 
\"kubernetes.io/projected/1b57fa2d-b65e-4c69-97ce-4a379470d2de-kube-api-access-hmx8s\") pod \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " Mar 13 10:37:57.666803 master-0 kubenswrapper[7508]: I0313 10:37:57.666756 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-msv2k\" (UniqueName: \"kubernetes.io/projected/61b83fd7-2b78-42a9-9d93-0be3fd59a679-kube-api-access-msv2k\") pod \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " Mar 13 10:37:57.666905 master-0 kubenswrapper[7508]: I0313 10:37:57.666874 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-catalog-content\") pod \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " Mar 13 10:37:57.666972 master-0 kubenswrapper[7508]: I0313 10:37:57.666952 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-utilities\") pod \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\" (UID: \"1b57fa2d-b65e-4c69-97ce-4a379470d2de\") " Mar 13 10:37:57.667021 master-0 kubenswrapper[7508]: I0313 10:37:57.666988 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-utilities\") pod \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " Mar 13 10:37:57.667065 master-0 kubenswrapper[7508]: I0313 10:37:57.667051 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-utilities\") pod \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " 
Mar 13 10:37:57.667132 master-0 kubenswrapper[7508]: I0313 10:37:57.667077 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ps4tc\" (UniqueName: \"kubernetes.io/projected/20df9416-90f4-4c21-a3bc-c6e5f6622e15-kube-api-access-ps4tc\") pod \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\" (UID: \"20df9416-90f4-4c21-a3bc-c6e5f6622e15\") " Mar 13 10:37:57.667178 master-0 kubenswrapper[7508]: I0313 10:37:57.667157 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-catalog-content\") pod \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\" (UID: \"61b83fd7-2b78-42a9-9d93-0be3fd59a679\") " Mar 13 10:37:57.674763 master-0 kubenswrapper[7508]: I0313 10:37:57.674695 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b57fa2d-b65e-4c69-97ce-4a379470d2de-kube-api-access-hmx8s" (OuterVolumeSpecName: "kube-api-access-hmx8s") pod "1b57fa2d-b65e-4c69-97ce-4a379470d2de" (UID: "1b57fa2d-b65e-4c69-97ce-4a379470d2de"). InnerVolumeSpecName "kube-api-access-hmx8s". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:57.676026 master-0 kubenswrapper[7508]: I0313 10:37:57.675754 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-utilities" (OuterVolumeSpecName: "utilities") pod "20df9416-90f4-4c21-a3bc-c6e5f6622e15" (UID: "20df9416-90f4-4c21-a3bc-c6e5f6622e15"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.676026 master-0 kubenswrapper[7508]: I0313 10:37:57.675960 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-utilities" (OuterVolumeSpecName: "utilities") pod "61b83fd7-2b78-42a9-9d93-0be3fd59a679" (UID: "61b83fd7-2b78-42a9-9d93-0be3fd59a679"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.676187 master-0 kubenswrapper[7508]: I0313 10:37:57.676031 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61b83fd7-2b78-42a9-9d93-0be3fd59a679-kube-api-access-msv2k" (OuterVolumeSpecName: "kube-api-access-msv2k") pod "61b83fd7-2b78-42a9-9d93-0be3fd59a679" (UID: "61b83fd7-2b78-42a9-9d93-0be3fd59a679"). InnerVolumeSpecName "kube-api-access-msv2k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:57.676187 master-0 kubenswrapper[7508]: I0313 10:37:57.676134 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-utilities" (OuterVolumeSpecName: "utilities") pod "1b57fa2d-b65e-4c69-97ce-4a379470d2de" (UID: "1b57fa2d-b65e-4c69-97ce-4a379470d2de"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.680739 master-0 kubenswrapper[7508]: I0313 10:37:57.680677 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20df9416-90f4-4c21-a3bc-c6e5f6622e15-kube-api-access-ps4tc" (OuterVolumeSpecName: "kube-api-access-ps4tc") pod "20df9416-90f4-4c21-a3bc-c6e5f6622e15" (UID: "20df9416-90f4-4c21-a3bc-c6e5f6622e15"). InnerVolumeSpecName "kube-api-access-ps4tc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:57.822658 master-0 kubenswrapper[7508]: I0313 10:37:57.822349 7508 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.822658 master-0 kubenswrapper[7508]: I0313 10:37:57.822398 7508 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.822658 master-0 kubenswrapper[7508]: I0313 10:37:57.822412 7508 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-utilities\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.822658 master-0 kubenswrapper[7508]: I0313 10:37:57.822426 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ps4tc\" (UniqueName: \"kubernetes.io/projected/20df9416-90f4-4c21-a3bc-c6e5f6622e15-kube-api-access-ps4tc\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.822658 master-0 kubenswrapper[7508]: I0313 10:37:57.822438 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hmx8s\" (UniqueName: \"kubernetes.io/projected/1b57fa2d-b65e-4c69-97ce-4a379470d2de-kube-api-access-hmx8s\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.822658 master-0 kubenswrapper[7508]: I0313 10:37:57.822450 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-msv2k\" (UniqueName: \"kubernetes.io/projected/61b83fd7-2b78-42a9-9d93-0be3fd59a679-kube-api-access-msv2k\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.876577 master-0 kubenswrapper[7508]: I0313 10:37:57.876501 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1b57fa2d-b65e-4c69-97ce-4a379470d2de" (UID: "1b57fa2d-b65e-4c69-97ce-4a379470d2de"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.906797 master-0 kubenswrapper[7508]: I0313 10:37:57.906714 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "20df9416-90f4-4c21-a3bc-c6e5f6622e15" (UID: "20df9416-90f4-4c21-a3bc-c6e5f6622e15"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.912263 master-0 kubenswrapper[7508]: I0313 10:37:57.912088 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "61b83fd7-2b78-42a9-9d93-0be3fd59a679" (UID: "61b83fd7-2b78-42a9-9d93-0be3fd59a679"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:37:57.923392 master-0 kubenswrapper[7508]: I0313 10:37:57.923369 7508 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1b57fa2d-b65e-4c69-97ce-4a379470d2de-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.923489 master-0 kubenswrapper[7508]: I0313 10:37:57.923478 7508 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/61b83fd7-2b78-42a9-9d93-0be3fd59a679-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:57.923579 master-0 kubenswrapper[7508]: I0313 10:37:57.923567 7508 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/20df9416-90f4-4c21-a3bc-c6e5f6622e15-catalog-content\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:58.073020 master-0 kubenswrapper[7508]: I0313 10:37:58.072824 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:37:58.073020 master-0 kubenswrapper[7508]: I0313 10:37:58.072929 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:37:58.115906 master-0 kubenswrapper[7508]: I0313 10:37:58.115848 7508 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-vmt9j_d6fbad53-304a-4338-974e-d9974921c48f/kube-rbac-proxy/0.log" Mar 13 10:37:58.116806 master-0 kubenswrapper[7508]: I0313 10:37:58.116764 7508 generic.go:334] "Generic (PLEG): container finished" podID="d6fbad53-304a-4338-974e-d9974921c48f" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" exitCode=1 Mar 13 10:37:58.117010 master-0 kubenswrapper[7508]: I0313 10:37:58.116916 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerDied","Data":"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6"} Mar 13 10:37:58.117010 master-0 kubenswrapper[7508]: I0313 10:37:58.116988 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerStarted","Data":"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030"} Mar 13 10:37:58.117411 master-0 kubenswrapper[7508]: I0313 10:37:58.117372 7508 scope.go:117] "RemoveContainer" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" Mar 13 10:37:58.120081 master-0 kubenswrapper[7508]: I0313 10:37:58.120035 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" event={"ID":"2563ecb2-5783-4c45-a7f6-180e14e1c8c4","Type":"ContainerStarted","Data":"7ca6a95d1c17626751cc95fa8484019b6c82c228421de958d7300514a2ca3f13"} Mar 13 10:37:58.122233 master-0 kubenswrapper[7508]: I0313 10:37:58.122194 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-cbmz8" 
event={"ID":"1b072636-e46b-47f6-af85-3210e62bbd2d","Type":"ContainerDied","Data":"7cc1254634b76ed578784e1acad8caa82c7b3cc646671ebb3835307434f88d23"} Mar 13 10:37:58.122321 master-0 kubenswrapper[7508]: I0313 10:37:58.122236 7508 scope.go:117] "RemoveContainer" containerID="9acadc1e500a9e42a3e74f238a47ec5a1293ae1ab61e880a662f2b2f6b011cdf" Mar 13 10:37:58.122392 master-0 kubenswrapper[7508]: I0313 10:37:58.122369 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-cbmz8" Mar 13 10:37:58.127860 master-0 kubenswrapper[7508]: I0313 10:37:58.127819 7508 generic.go:334] "Generic (PLEG): container finished" podID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerID="b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6" exitCode=0 Mar 13 10:37:58.127976 master-0 kubenswrapper[7508]: I0313 10:37:58.127909 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-dbhll" Mar 13 10:37:58.128436 master-0 kubenswrapper[7508]: I0313 10:37:58.128311 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhll" event={"ID":"1b57fa2d-b65e-4c69-97ce-4a379470d2de","Type":"ContainerDied","Data":"b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6"} Mar 13 10:37:58.128436 master-0 kubenswrapper[7508]: I0313 10:37:58.128395 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-dbhll" event={"ID":"1b57fa2d-b65e-4c69-97ce-4a379470d2de","Type":"ContainerDied","Data":"11c2c30e0a3283728cc95581ddad85c479ad7dd17380bf08f4e4ddbd96d47244"} Mar 13 10:37:58.130704 master-0 kubenswrapper[7508]: I0313 10:37:58.130617 7508 generic.go:334] "Generic (PLEG): container finished" podID="2157cb66-d458-4353-bc9c-ef761e61e5c5" containerID="6b4220f271a2b153bc0e77946705d348742d71cd7644e3f17d99cbdeff70f16f" exitCode=0 Mar 13 10:37:58.130781 master-0 
kubenswrapper[7508]: I0313 10:37:58.130688 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerDied","Data":"6b4220f271a2b153bc0e77946705d348742d71cd7644e3f17d99cbdeff70f16f"} Mar 13 10:37:58.136160 master-0 kubenswrapper[7508]: I0313 10:37:58.135431 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-29dk6" event={"ID":"61b83fd7-2b78-42a9-9d93-0be3fd59a679","Type":"ContainerDied","Data":"29b996efefea62a5520da7a1bea5d4af7c6b2e7ab4bada22cac5fe5c2c1aa4be"} Mar 13 10:37:58.136160 master-0 kubenswrapper[7508]: I0313 10:37:58.135816 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-29dk6" Mar 13 10:37:58.139304 master-0 kubenswrapper[7508]: I0313 10:37:58.137413 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-wlpwf_20df9416-90f4-4c21-a3bc-c6e5f6622e15/extract-content/0.log" Mar 13 10:37:58.139304 master-0 kubenswrapper[7508]: I0313 10:37:58.137893 7508 generic.go:334] "Generic (PLEG): container finished" podID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerID="f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e" exitCode=2 Mar 13 10:37:58.139304 master-0 kubenswrapper[7508]: I0313 10:37:58.137989 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlpwf" event={"ID":"20df9416-90f4-4c21-a3bc-c6e5f6622e15","Type":"ContainerDied","Data":"f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e"} Mar 13 10:37:58.139304 master-0 kubenswrapper[7508]: I0313 10:37:58.138022 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-wlpwf" 
event={"ID":"20df9416-90f4-4c21-a3bc-c6e5f6622e15","Type":"ContainerDied","Data":"6962fe374e095b27f0316a8835e65d00f53256a1d8e385e98ec2f5caea44bbb2"} Mar 13 10:37:58.139304 master-0 kubenswrapper[7508]: I0313 10:37:58.137996 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-wlpwf" Mar 13 10:37:58.140793 master-0 kubenswrapper[7508]: I0313 10:37:58.140741 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerStarted","Data":"799c00d706ab085bdece95573540241444c883e9ee37d48b06d60922afea2895"} Mar 13 10:37:58.147732 master-0 kubenswrapper[7508]: I0313 10:37:58.147677 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerStarted","Data":"58d7404b838e4c314c4bb71f4fca18a37f75d33d03431ce85b9c2b50d05d498a"} Mar 13 10:37:58.786643 master-0 kubenswrapper[7508]: I0313 10:37:58.786606 7508 scope.go:117] "RemoveContainer" containerID="3defd4dddffc43465173ec71d65059b52bceb629972998b93ecfd713e6cbe46d" Mar 13 10:37:58.843650 master-0 kubenswrapper[7508]: I0313 10:37:58.843585 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:37:58.847905 master-0 kubenswrapper[7508]: I0313 10:37:58.847866 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_6b488263-6a56-439c-945e-926936ed049d/installer/0.log" Mar 13 10:37:58.847998 master-0 kubenswrapper[7508]: I0313 10:37:58.847931 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:58.851260 master-0 kubenswrapper[7508]: I0313 10:37:58.851223 7508 scope.go:117] "RemoveContainer" containerID="b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6" Mar 13 10:37:58.865502 master-0 kubenswrapper[7508]: I0313 10:37:58.865467 7508 scope.go:117] "RemoveContainer" containerID="f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f" Mar 13 10:37:58.885713 master-0 kubenswrapper[7508]: I0313 10:37:58.885668 7508 scope.go:117] "RemoveContainer" containerID="b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6" Mar 13 10:37:58.886266 master-0 kubenswrapper[7508]: E0313 10:37:58.886231 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6\": container with ID starting with b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6 not found: ID does not exist" containerID="b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6" Mar 13 10:37:58.886344 master-0 kubenswrapper[7508]: I0313 10:37:58.886280 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6"} err="failed to get container status \"b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6\": rpc error: code = NotFound desc = could not find container \"b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6\": container with ID starting with b8c49ab249fa816bc7792b8b1147460b4615683e9221481eab321bbcabff19b6 not found: ID does not exist" Mar 13 10:37:58.886344 master-0 kubenswrapper[7508]: I0313 10:37:58.886310 7508 scope.go:117] "RemoveContainer" containerID="f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f" Mar 13 10:37:58.886554 master-0 kubenswrapper[7508]: E0313 10:37:58.886526 
7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f\": container with ID starting with f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f not found: ID does not exist" containerID="f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f" Mar 13 10:37:58.886645 master-0 kubenswrapper[7508]: I0313 10:37:58.886552 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f"} err="failed to get container status \"f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f\": rpc error: code = NotFound desc = could not find container \"f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f\": container with ID starting with f9ac562d358f18fcdb57f11521895f7b7594e9c7f3b79761472475715640f44f not found: ID does not exist" Mar 13 10:37:58.886645 master-0 kubenswrapper[7508]: I0313 10:37:58.886568 7508 scope.go:117] "RemoveContainer" containerID="23014a10e46dd6d6965e2f7692313ca937c493926c4d143fea3625dfacd51f74" Mar 13 10:37:58.902597 master-0 kubenswrapper[7508]: I0313 10:37:58.902552 7508 scope.go:117] "RemoveContainer" containerID="e006633100d465a175ba215dc8fcac40a2f8affb6e3cca4831dc0965bfd5291f" Mar 13 10:37:58.976554 master-0 kubenswrapper[7508]: I0313 10:37:58.976498 7508 scope.go:117] "RemoveContainer" containerID="f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e" Mar 13 10:37:58.989264 master-0 kubenswrapper[7508]: I0313 10:37:58.989230 7508 scope.go:117] "RemoveContainer" containerID="191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95" Mar 13 10:37:59.013948 master-0 kubenswrapper[7508]: I0313 10:37:59.013914 7508 scope.go:117] "RemoveContainer" containerID="f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e" Mar 13 10:37:59.014316 master-0 
kubenswrapper[7508]: E0313 10:37:59.014288 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e\": container with ID starting with f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e not found: ID does not exist" containerID="f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e" Mar 13 10:37:59.014391 master-0 kubenswrapper[7508]: I0313 10:37:59.014327 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e"} err="failed to get container status \"f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e\": rpc error: code = NotFound desc = could not find container \"f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e\": container with ID starting with f3fe32e80895555912a34695dfa1a9e1cd81f0dd30b84d7ef335248e7259440e not found: ID does not exist" Mar 13 10:37:59.014391 master-0 kubenswrapper[7508]: I0313 10:37:59.014384 7508 scope.go:117] "RemoveContainer" containerID="191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95" Mar 13 10:37:59.014735 master-0 kubenswrapper[7508]: E0313 10:37:59.014708 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95\": container with ID starting with 191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95 not found: ID does not exist" containerID="191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95" Mar 13 10:37:59.014830 master-0 kubenswrapper[7508]: I0313 10:37:59.014736 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95"} err="failed to get container status 
\"191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95\": rpc error: code = NotFound desc = could not find container \"191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95\": container with ID starting with 191d413efb84d0c5388a688a7f9402de7d281b60845fae64c75d8ff9439b6d95 not found: ID does not exist" Mar 13 10:37:59.071428 master-0 kubenswrapper[7508]: I0313 10:37:59.071356 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-kubelet-dir\") pod \"6b488263-6a56-439c-945e-926936ed049d\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " Mar 13 10:37:59.071428 master-0 kubenswrapper[7508]: I0313 10:37:59.071426 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-var-lock\") pod \"6b488263-6a56-439c-945e-926936ed049d\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " Mar 13 10:37:59.071707 master-0 kubenswrapper[7508]: I0313 10:37:59.071481 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6b488263-6a56-439c-945e-926936ed049d" (UID: "6b488263-6a56-439c-945e-926936ed049d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:59.071707 master-0 kubenswrapper[7508]: I0313 10:37:59.071541 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b488263-6a56-439c-945e-926936ed049d-kube-api-access\") pod \"6b488263-6a56-439c-945e-926936ed049d\" (UID: \"6b488263-6a56-439c-945e-926936ed049d\") " Mar 13 10:37:59.071707 master-0 kubenswrapper[7508]: I0313 10:37:59.071600 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-var-lock" (OuterVolumeSpecName: "var-lock") pod "6b488263-6a56-439c-945e-926936ed049d" (UID: "6b488263-6a56-439c-945e-926936ed049d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:37:59.072002 master-0 kubenswrapper[7508]: I0313 10:37:59.071968 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:59.072083 master-0 kubenswrapper[7508]: I0313 10:37:59.072002 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6b488263-6a56-439c-945e-926936ed049d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:59.074590 master-0 kubenswrapper[7508]: I0313 10:37:59.074551 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b488263-6a56-439c-945e-926936ed049d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6b488263-6a56-439c-945e-926936ed049d" (UID: "6b488263-6a56-439c-945e-926936ed049d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:37:59.148493 master-0 kubenswrapper[7508]: I0313 10:37:59.148403 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:37:59.148772 master-0 kubenswrapper[7508]: I0313 10:37:59.148508 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:37:59.171048 master-0 kubenswrapper[7508]: I0313 10:37:59.170951 7508 generic.go:334] "Generic (PLEG): container finished" podID="257ae542-4a06-42d3-b3e8-bf0a376494a8" containerID="799c00d706ab085bdece95573540241444c883e9ee37d48b06d60922afea2895" exitCode=0 Mar 13 10:37:59.171362 master-0 kubenswrapper[7508]: I0313 10:37:59.171064 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerDied","Data":"799c00d706ab085bdece95573540241444c883e9ee37d48b06d60922afea2895"} Mar 13 10:37:59.173150 master-0 kubenswrapper[7508]: I0313 10:37:59.173123 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6b488263-6a56-439c-945e-926936ed049d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:37:59.175251 master-0 kubenswrapper[7508]: I0313 10:37:59.175231 7508 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_6b488263-6a56-439c-945e-926936ed049d/installer/0.log" Mar 13 10:37:59.175388 master-0 kubenswrapper[7508]: I0313 10:37:59.175369 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"6b488263-6a56-439c-945e-926936ed049d","Type":"ContainerDied","Data":"f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a"} Mar 13 10:37:59.175448 master-0 kubenswrapper[7508]: I0313 10:37:59.175394 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a" Mar 13 10:37:59.175448 master-0 kubenswrapper[7508]: I0313 10:37:59.175399 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:37:59.181845 master-0 kubenswrapper[7508]: I0313 10:37:59.181767 7508 generic.go:334] "Generic (PLEG): container finished" podID="f99b999c-4213-4d29-ab14-26c584e88445" containerID="58d7404b838e4c314c4bb71f4fca18a37f75d33d03431ce85b9c2b50d05d498a" exitCode=0 Mar 13 10:37:59.181845 master-0 kubenswrapper[7508]: I0313 10:37:59.181841 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerDied","Data":"58d7404b838e4c314c4bb71f4fca18a37f75d33d03431ce85b9c2b50d05d498a"} Mar 13 10:38:00.267307 master-0 kubenswrapper[7508]: I0313 10:38:00.267215 7508 generic.go:334] "Generic (PLEG): container finished" podID="8b07c5ae-1149-4031-bd92-6df4331e586c" containerID="fc95bff32f2114b905d9fbe18892b7b039189a377e939c5fcb424714913dd15f" exitCode=0 Mar 13 10:38:00.267307 master-0 kubenswrapper[7508]: I0313 10:38:00.267312 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" 
event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerDied","Data":"fc95bff32f2114b905d9fbe18892b7b039189a377e939c5fcb424714913dd15f"} Mar 13 10:38:00.269071 master-0 kubenswrapper[7508]: I0313 10:38:00.269044 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-vmt9j_d6fbad53-304a-4338-974e-d9974921c48f/kube-rbac-proxy/0.log" Mar 13 10:38:00.270421 master-0 kubenswrapper[7508]: I0313 10:38:00.270247 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerStarted","Data":"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c"} Mar 13 10:38:00.272356 master-0 kubenswrapper[7508]: I0313 10:38:00.272286 7508 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="2b53706ef774eb15c126f57be58e4c0c9f005142fd0e9af295b43871ae8de7ef" exitCode=0 Mar 13 10:38:00.275393 master-0 kubenswrapper[7508]: I0313 10:38:00.275345 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"8629ec87935b9c8163acca5e90c43ffc35598371cd514995496e1b481f1cd153"} Mar 13 10:38:01.998068 master-0 kubenswrapper[7508]: I0313 10:38:01.997978 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:38:02.950499 master-0 kubenswrapper[7508]: I0313 10:38:02.950417 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled 
while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:02.950742 master-0 kubenswrapper[7508]: I0313 10:38:02.950581 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:02.953320 master-0 kubenswrapper[7508]: I0313 10:38:02.953242 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:02.953413 master-0 kubenswrapper[7508]: I0313 10:38:02.953338 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:03.427271 master-0 kubenswrapper[7508]: I0313 10:38:03.427147 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 13 10:38:03.427271 master-0 kubenswrapper[7508]: I0313 10:38:03.427213 7508 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="fc59a335ab92b5426116aa2f5adb31266760392f014df421d723f95bb6f6ebfb" exitCode=137 Mar 13 10:38:03.660557 master-0 kubenswrapper[7508]: I0313 10:38:03.660522 7508 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 13 10:38:03.660701 master-0 kubenswrapper[7508]: I0313 10:38:03.660600 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:38:03.695526 master-0 kubenswrapper[7508]: I0313 10:38:03.695386 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 13 10:38:03.695526 master-0 kubenswrapper[7508]: I0313 10:38:03.695476 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 13 10:38:03.695812 master-0 kubenswrapper[7508]: I0313 10:38:03.695512 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:38:03.695812 master-0 kubenswrapper[7508]: I0313 10:38:03.695698 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:38:03.696050 master-0 kubenswrapper[7508]: I0313 10:38:03.696024 7508 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:38:03.696132 master-0 kubenswrapper[7508]: I0313 10:38:03.696054 7508 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:38:04.436657 master-0 kubenswrapper[7508]: I0313 10:38:04.436599 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 13 10:38:04.436657 master-0 kubenswrapper[7508]: I0313 10:38:04.436676 7508 scope.go:117] "RemoveContainer" containerID="2b53706ef774eb15c126f57be58e4c0c9f005142fd0e9af295b43871ae8de7ef" Mar 13 10:38:04.437504 master-0 kubenswrapper[7508]: I0313 10:38:04.436736 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:38:04.452271 master-0 kubenswrapper[7508]: I0313 10:38:04.452221 7508 scope.go:117] "RemoveContainer" containerID="fc59a335ab92b5426116aa2f5adb31266760392f014df421d723f95bb6f6ebfb" Mar 13 10:38:04.999078 master-0 kubenswrapper[7508]: I0313 10:38:04.998955 7508 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:05.506473 master-0 kubenswrapper[7508]: I0313 10:38:05.506410 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes" Mar 13 10:38:05.507020 master-0 kubenswrapper[7508]: I0313 10:38:05.506810 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 10:38:05.863206 master-0 kubenswrapper[7508]: E0313 10:38:05.863025 7508 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:06.739056 master-0 kubenswrapper[7508]: I0313 10:38:06.738964 7508 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\\\"},\\\"status\\\":{\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:00b591b3820682dc99f16f07a3a0a4ec06dfedba63cd0f79b998ac4509fabea3\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-autoscaler-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/cluster-autoscaler-operator/tls\\\",\\\"name\\\":\\\"cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chxxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"auth-proxy-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-chxxr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-machine-api\"/\"cluster-autoscaler-operator-69576476f7-p7qlt\": Timeout: request did not complete within requested timeout - context deadline exceeded" Mar 13 10:38:06.749683 master-0 kubenswrapper[7508]: E0313 10:38:06.749487 7508 event.go:359] "Server 
rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189c6050a39b0dd0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:37:32.7356308 +0000 UTC m=+91.478455917,LastTimestamp:2026-03-13 10:37:32.7356308 +0000 UTC m=+91.478455917,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:38:10.049216 master-0 kubenswrapper[7508]: E0313 10:38:10.048674 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 10:38:11.227387 master-0 kubenswrapper[7508]: I0313 10:38:11.227033 7508 generic.go:334] "Generic (PLEG): container finished" podID="53da2840-4a92-497a-a9d3-973583887147" containerID="c24269090669e540d849b1a7ede32ee9641b8d7335ec065d4a9e4c4317788e00" exitCode=0 Mar 13 10:38:12.318152 master-0 kubenswrapper[7508]: I0313 10:38:12.318055 7508 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-t2xfz container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 10:38:12.318724 master-0 kubenswrapper[7508]: I0313 10:38:12.318193 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" podUID="0932314b-ccf5-4be5-99f8-b99886392daa" 
containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 10:38:12.557722 master-0 kubenswrapper[7508]: I0313 10:38:12.557551 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:12.557722 master-0 kubenswrapper[7508]: I0313 10:38:12.557582 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:12.557722 master-0 kubenswrapper[7508]: I0313 10:38:12.557628 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:12.557722 master-0 kubenswrapper[7508]: I0313 10:38:12.557653 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:14.898525 master-0 kubenswrapper[7508]: E0313 10:38:14.898354 7508 
kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:38:04Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:38:04Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:38:04Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:38:04Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeByt
es\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324
f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e9ee63a30a9b95b5801afa36e09fc583ec2cda3c5cb3c8676e478fea016abfa1\\\"],\\\"sizeBytes\\\":470680779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ebee49810f493f9b566740bd61256fd40b897cc51423f1efa01a02bb57ce177d\\\"],\\\"sizeBytes\\\":467234714},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b5535524669
0bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626\\\"],\\\"sizeBytes\\\":448828105},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\"],\\\"sizeBytes\\\":443271011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"],\\\"sizeBytes\\\":438654375}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:14.999154 master-0 kubenswrapper[7508]: I0313 10:38:14.999036 7508 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:15.864236 master-0 kubenswrapper[7508]: E0313 10:38:15.863885 7508 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": 
net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:16.259265 master-0 kubenswrapper[7508]: I0313 10:38:16.259143 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-t2xfz_0932314b-ccf5-4be5-99f8-b99886392daa/etcd-operator/0.log" Mar 13 10:38:16.259265 master-0 kubenswrapper[7508]: I0313 10:38:16.259196 7508 generic.go:334] "Generic (PLEG): container finished" podID="0932314b-ccf5-4be5-99f8-b99886392daa" containerID="b633052bfd920e96b180e39e901d4b8b219bb35a62da570c5f41752fe4e617fe" exitCode=0 Mar 13 10:38:16.261030 master-0 kubenswrapper[7508]: I0313 10:38:16.260997 7508 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f" exitCode=0 Mar 13 10:38:16.262300 master-0 kubenswrapper[7508]: I0313 10:38:16.262274 7508 generic.go:334] "Generic (PLEG): container finished" podID="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" containerID="e4267b4b9b6b191ff966b31bd837f533d3228034c0ef80179d1995e5cb7ea50e" exitCode=0 Mar 13 10:38:22.590807 master-0 kubenswrapper[7508]: I0313 10:38:22.590688 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:22.591570 master-0 kubenswrapper[7508]: I0313 10:38:22.590850 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting 
headers)" Mar 13 10:38:22.591708 master-0 kubenswrapper[7508]: I0313 10:38:22.591667 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:22.591789 master-0 kubenswrapper[7508]: I0313 10:38:22.591710 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:24.900320 master-0 kubenswrapper[7508]: E0313 10:38:24.900221 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 13 10:38:24.998133 master-0 kubenswrapper[7508]: I0313 10:38:24.997978 7508 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:25.864824 master-0 kubenswrapper[7508]: E0313 10:38:25.864731 7508 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:25.864824 master-0 kubenswrapper[7508]: I0313 
10:38:25.864807 7508 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 10:38:29.272721 master-0 kubenswrapper[7508]: E0313 10:38:29.272622 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 10:38:29.379376 master-0 kubenswrapper[7508]: I0313 10:38:29.379270 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-hkjrg_a3c91eef-ec46-419f-b418-ac3a8094b77d/approver/0.log" Mar 13 10:38:29.380359 master-0 kubenswrapper[7508]: I0313 10:38:29.380260 7508 generic.go:334] "Generic (PLEG): container finished" podID="a3c91eef-ec46-419f-b418-ac3a8094b77d" containerID="d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248" exitCode=1 Mar 13 10:38:30.389320 master-0 kubenswrapper[7508]: I0313 10:38:30.389234 7508 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9" exitCode=0 Mar 13 10:38:32.558391 master-0 kubenswrapper[7508]: I0313 10:38:32.557699 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:32.559168 master-0 kubenswrapper[7508]: I0313 10:38:32.558517 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request 
canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:34.900919 master-0 kubenswrapper[7508]: E0313 10:38:34.900848 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:35.865798 master-0 kubenswrapper[7508]: E0313 10:38:35.865523 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 13 10:38:38.438503 master-0 kubenswrapper[7508]: I0313 10:38:38.438398 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-b2ss8_cf740515-d70d-44b6-ac00-21143b5494d1/ingress-operator/0.log" Mar 13 10:38:38.438503 master-0 kubenswrapper[7508]: I0313 10:38:38.438463 7508 generic.go:334] "Generic (PLEG): container finished" podID="cf740515-d70d-44b6-ac00-21143b5494d1" containerID="1619a1ce8609d442d9975720a8d6d707786b968509ed048f691e33fc7d117748" exitCode=1 Mar 13 10:38:39.638856 master-0 kubenswrapper[7508]: E0313 10:38:39.638793 7508 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:38:39.639383 master-0 kubenswrapper[7508]: E0313 10:38:39.639078 7508 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.14s" Mar 13 10:38:39.639383 master-0 kubenswrapper[7508]: I0313 10:38:39.639157 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:38:39.639383 master-0 kubenswrapper[7508]: I0313 10:38:39.639239 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:38:39.648306 master-0 kubenswrapper[7508]: I0313 10:38:39.648201 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 10:38:40.753629 master-0 kubenswrapper[7508]: E0313 10:38:40.753264 7508 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{installer-2-master-0.189c6050c576bdba openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:installer-2-master-0,UID:c834b554-c652-4f45-9110-3d4e260ba98a,APIVersion:v1,ResourceVersion:8823,FieldPath:spec.containers{installer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:37:33.303676346 +0000 UTC m=+92.046501453,LastTimestamp:2026-03-13 10:37:33.303676346 +0000 UTC m=+92.046501453,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:38:42.557856 master-0 kubenswrapper[7508]: I0313 10:38:42.557779 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:42.558714 
master-0 kubenswrapper[7508]: I0313 10:38:42.558658 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:43.478137 master-0 kubenswrapper[7508]: I0313 10:38:43.477984 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_1e86a3b0-37b3-4df1-a522-f29cda076753/installer/0.log" Mar 13 10:38:43.478137 master-0 kubenswrapper[7508]: I0313 10:38:43.478068 7508 generic.go:334] "Generic (PLEG): container finished" podID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerID="d19b978c1e8101a0212df3b6611d9d31aa1e8b34d80df670a9b5c7dd94abdbf2" exitCode=1 Mar 13 10:38:44.902050 master-0 kubenswrapper[7508]: E0313 10:38:44.901962 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 13 10:38:46.067425 master-0 kubenswrapper[7508]: E0313 10:38:46.067309 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 13 10:38:52.558950 master-0 kubenswrapper[7508]: I0313 10:38:52.558842 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:38:52.558950 master-0 kubenswrapper[7508]: I0313 10:38:52.558943 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:54.902322 master-0 kubenswrapper[7508]: E0313 10:38:54.902230 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:38:54.902322 master-0 kubenswrapper[7508]: E0313 10:38:54.902285 7508 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 10:38:56.469210 master-0 kubenswrapper[7508]: E0313 10:38:56.469053 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 13 10:38:57.566497 master-0 kubenswrapper[7508]: I0313 10:38:57.566422 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-85b658d7fb-45fq6_97328e01-1227-417e-9af7-6426495d96db/packageserver/0.log" Mar 13 10:38:57.566497 master-0 kubenswrapper[7508]: I0313 10:38:57.566482 7508 generic.go:334] "Generic (PLEG): container finished" podID="97328e01-1227-417e-9af7-6426495d96db" containerID="c39379a7ceff230ca12a3c25b2f95b4de4ef093f144e78b137c6626ee9d2fcfb" exitCode=2 Mar 13 10:38:57.569951 master-0 
kubenswrapper[7508]: I0313 10:38:57.569777 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/0.log" Mar 13 10:38:57.569951 master-0 kubenswrapper[7508]: I0313 10:38:57.569848 7508 generic.go:334] "Generic (PLEG): container finished" podID="0881de70-2db3-4fc2-b976-b55c11dc239d" containerID="d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829" exitCode=1 Mar 13 10:38:58.578643 master-0 kubenswrapper[7508]: I0313 10:38:58.578556 7508 generic.go:334] "Generic (PLEG): container finished" podID="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906" containerID="acffbeb48d69148ddc4c8917c5bd669fe4ed2976ba6b612592b2abc4fff01c7e" exitCode=0 Mar 13 10:38:58.580963 master-0 kubenswrapper[7508]: I0313 10:38:58.580875 7508 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe" exitCode=1 Mar 13 10:39:01.558581 master-0 kubenswrapper[7508]: I0313 10:39:01.558372 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:39:01.558581 master-0 kubenswrapper[7508]: I0313 10:39:01.558574 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:39:06.741625 master-0 kubenswrapper[7508]: I0313 10:39:06.741498 7508 status_manager.go:851] "Failed to get status for pod" 
podUID="8b07c5ae-1149-4031-bd92-6df4331e586c" pod="openshift-marketplace/community-operators-lhqzl" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods community-operators-lhqzl)" Mar 13 10:39:07.270750 master-0 kubenswrapper[7508]: E0313 10:39:07.270431 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 13 10:39:11.557414 master-0 kubenswrapper[7508]: I0313 10:39:11.557328 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:39:11.558205 master-0 kubenswrapper[7508]: I0313 10:39:11.557455 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:39:12.318771 master-0 kubenswrapper[7508]: I0313 10:39:12.318673 7508 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-t2xfz container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" start-of-body= Mar 13 10:39:12.319238 master-0 kubenswrapper[7508]: I0313 10:39:12.318785 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" podUID="0932314b-ccf5-4be5-99f8-b99886392daa" 
containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.19:8443/healthz\": dial tcp 10.128.0.19:8443: connect: connection refused" Mar 13 10:39:13.652998 master-0 kubenswrapper[7508]: E0313 10:39:13.652886 7508 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:39:13.653787 master-0 kubenswrapper[7508]: E0313 10:39:13.653212 7508 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.014s" Mar 13 10:39:13.653787 master-0 kubenswrapper[7508]: I0313 10:39:13.653263 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:39:13.653787 master-0 kubenswrapper[7508]: I0313 10:39:13.653310 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerDied","Data":"c24269090669e540d849b1a7ede32ee9641b8d7335ec065d4a9e4c4317788e00"} Mar 13 10:39:13.655022 master-0 kubenswrapper[7508]: I0313 10:39:13.654761 7508 scope.go:117] "RemoveContainer" containerID="c24269090669e540d849b1a7ede32ee9641b8d7335ec065d4a9e4c4317788e00" Mar 13 10:39:13.688307 master-0 kubenswrapper[7508]: I0313 10:39:13.688252 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 10:39:14.760023 master-0 kubenswrapper[7508]: E0313 10:39:14.759745 7508 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{packageserver-85b658d7fb-45fq6.189c6050c5ec0624 openshift-operator-lifecycle-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-operator-lifecycle-manager,Name:packageserver-85b658d7fb-45fq6,UID:97328e01-1227-417e-9af7-6426495d96db,APIVersion:v1,ResourceVersion:9393,FieldPath:spec.containers{packageserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:37:33.311362596 +0000 UTC m=+92.054187713,LastTimestamp:2026-03-13 10:37:33.311362596 +0000 UTC m=+92.054187713,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:39:15.096764 master-0 kubenswrapper[7508]: E0313 10:39:15.096430 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:39:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:39:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:39:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:39:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843
b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8c978bb5c329452b181f61f00452b4c2bfd83d245db56050bc7607972a791a76\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:e6567accc084db971e077b5ca666357e3a326fa27f69fc7135a5bc2e19f998eb\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221745369},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44
408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef
6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e9ee63a30a9b95b5801afa36e09fc583ec2cda3c5cb3c8676e478fea016abfa1\\\"],\\\"sizeBytes\\\":470680779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:39:18.872961 master-0 kubenswrapper[7508]: E0313 10:39:18.872211 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 13 10:39:21.558947 master-0 kubenswrapper[7508]: I0313 10:39:21.558831 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:39:21.560744 master-0 kubenswrapper[7508]: I0313 10:39:21.558949 
7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:39:25.097954 master-0 kubenswrapper[7508]: E0313 10:39:25.097837 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:39:27.789540 master-0 kubenswrapper[7508]: I0313 10:39:27.789321 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-z9wrg_8d2fdba3-9478-4165-9207-d01483625607/network-operator/1.log" Mar 13 10:39:27.790612 master-0 kubenswrapper[7508]: I0313 10:39:27.790035 7508 generic.go:334] "Generic (PLEG): container finished" podID="8d2fdba3-9478-4165-9207-d01483625607" containerID="1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258" exitCode=255 Mar 13 10:39:27.792543 master-0 kubenswrapper[7508]: I0313 10:39:27.792485 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-wz9t2_b57f1c19-f44a-4405-8135-79aef1d1ce07/cluster-storage-operator/0.log" Mar 13 10:39:27.792543 master-0 kubenswrapper[7508]: I0313 10:39:27.792529 7508 generic.go:334] "Generic (PLEG): container finished" podID="b57f1c19-f44a-4405-8135-79aef1d1ce07" containerID="6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea" exitCode=255 Mar 13 10:39:28.802246 master-0 kubenswrapper[7508]: I0313 10:39:28.801999 7508 generic.go:334] "Generic (PLEG): container finished" podID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" 
containerID="a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f" exitCode=0 Mar 13 10:39:31.558108 master-0 kubenswrapper[7508]: I0313 10:39:31.557985 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:39:31.558108 master-0 kubenswrapper[7508]: I0313 10:39:31.558072 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:39:32.075234 master-0 kubenswrapper[7508]: E0313 10:39:32.074344 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 13 10:39:34.561637 master-0 kubenswrapper[7508]: I0313 10:39:34.561570 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:39:34.562635 master-0 kubenswrapper[7508]: I0313 10:39:34.562565 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: 
connection refused" Mar 13 10:39:34.562940 master-0 kubenswrapper[7508]: I0313 10:39:34.561596 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:39:34.563249 master-0 kubenswrapper[7508]: I0313 10:39:34.563199 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:39:35.098764 master-0 kubenswrapper[7508]: E0313 10:39:35.098655 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:39:41.558016 master-0 kubenswrapper[7508]: I0313 10:39:41.557934 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:39:41.558684 master-0 kubenswrapper[7508]: I0313 10:39:41.558033 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:39:44.561735 master-0 kubenswrapper[7508]: I0313 10:39:44.561607 
7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:39:44.561735 master-0 kubenswrapper[7508]: I0313 10:39:44.561715 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:39:44.562903 master-0 kubenswrapper[7508]: I0313 10:39:44.561797 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:39:44.562903 master-0 kubenswrapper[7508]: I0313 10:39:44.561900 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:39:44.917906 master-0 kubenswrapper[7508]: I0313 10:39:44.917805 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-657wt_db9faadf-74e9-4a7f-b3a6-902dd14ac978/manager/0.log" Mar 13 10:39:44.918544 master-0 kubenswrapper[7508]: I0313 10:39:44.918489 7508 generic.go:334] "Generic (PLEG): container finished" podID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerID="84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c" exitCode=1 
Mar 13 10:39:44.921626 master-0 kubenswrapper[7508]: I0313 10:39:44.921592 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-22jb5_ec33c506-8abe-4659-84d3-a294c31b446c/manager/0.log" Mar 13 10:39:44.921626 master-0 kubenswrapper[7508]: I0313 10:39:44.921625 7508 generic.go:334] "Generic (PLEG): container finished" podID="ec33c506-8abe-4659-84d3-a294c31b446c" containerID="eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946" exitCode=1 Mar 13 10:39:44.923684 master-0 kubenswrapper[7508]: I0313 10:39:44.923633 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/0.log" Mar 13 10:39:44.923812 master-0 kubenswrapper[7508]: I0313 10:39:44.923693 7508 generic.go:334] "Generic (PLEG): container finished" podID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" containerID="65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e" exitCode=1 Mar 13 10:39:45.099631 master-0 kubenswrapper[7508]: E0313 10:39:45.099526 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:39:45.959743 master-0 kubenswrapper[7508]: I0313 10:39:45.959628 7508 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:39:45.959743 master-0 kubenswrapper[7508]: I0313 10:39:45.959717 7508 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Mar 13 10:39:46.045842 master-0 kubenswrapper[7508]: I0313 10:39:46.045735 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:39:46.046241 master-0 kubenswrapper[7508]: I0313 10:39:46.045846 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 10:39:47.692142 master-0 kubenswrapper[7508]: E0313 10:39:47.691983 7508 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: E0313 10:39:47.692319 7508 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.039s" Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.692366 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerDied","Data":"b633052bfd920e96b180e39e901d4b8b219bb35a62da570c5f41752fe4e617fe"} Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.692454 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.692477 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f"} Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.692512 7508 scope.go:117] "RemoveContainer" containerID="fba0ad5a7ea5359314eabe4a73e6d377274ab61d90c33e03f2dabdbba3678155" Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.692716 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.693041 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:39:47.693441 master-0 kubenswrapper[7508]: I0313 10:39:47.693132 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerDied","Data":"e4267b4b9b6b191ff966b31bd837f533d3228034c0ef80179d1995e5cb7ea50e"} Mar 13 10:39:47.694402 master-0 kubenswrapper[7508]: I0313 10:39:47.693823 7508 scope.go:117] "RemoveContainer" containerID="b633052bfd920e96b180e39e901d4b8b219bb35a62da570c5f41752fe4e617fe" Mar 13 10:39:47.695642 master-0 kubenswrapper[7508]: I0313 10:39:47.695189 7508 scope.go:117] "RemoveContainer" containerID="e4267b4b9b6b191ff966b31bd837f533d3228034c0ef80179d1995e5cb7ea50e" Mar 13 10:39:47.695642 master-0 kubenswrapper[7508]: I0313 10:39:47.695298 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:39:47.695642 master-0 kubenswrapper[7508]: I0313 10:39:47.695381 7508 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:39:47.703439 master-0 kubenswrapper[7508]: I0313 10:39:47.703392 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 10:39:48.476881 master-0 kubenswrapper[7508]: E0313 10:39:48.476739 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 10:39:48.762788 master-0 kubenswrapper[7508]: E0313 10:39:48.762525 7508 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{community-operators-lhqzl.189c6050c5efce5f openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:community-operators-lhqzl,UID:8b07c5ae-1149-4031-bd92-6df4331e586c,APIVersion:v1,ResourceVersion:9509,FieldPath:spec.initContainers{extract-utilities},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:37:33.311610463 +0000 UTC m=+92.054435580,LastTimestamp:2026-03-13 10:37:33.311610463 +0000 UTC m=+92.054435580,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:39:51.557550 master-0 kubenswrapper[7508]: I0313 10:39:51.557451 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:39:51.558229 master-0 kubenswrapper[7508]: I0313 10:39:51.557556 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:39:54.561704 master-0 kubenswrapper[7508]: I0313 10:39:54.561509 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:39:54.562815 master-0 kubenswrapper[7508]: I0313 10:39:54.561741 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:39:54.563201 master-0 kubenswrapper[7508]: I0313 10:39:54.563140 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Liveness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:39:54.563488 master-0 kubenswrapper[7508]: I0313 10:39:54.563437 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" 
containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:39:55.101323 master-0 kubenswrapper[7508]: E0313 10:39:55.101185 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": the server was unable to return a response in the time allotted, but may still be processing the request (get nodes master-0)" Mar 13 10:39:55.101323 master-0 kubenswrapper[7508]: E0313 10:39:55.101280 7508 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 13 10:39:55.959892 master-0 kubenswrapper[7508]: I0313 10:39:55.959801 7508 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:39:55.961085 master-0 kubenswrapper[7508]: I0313 10:39:55.959950 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" Mar 13 10:39:55.961085 master-0 kubenswrapper[7508]: I0313 10:39:55.960214 7508 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:39:55.961085 master-0 kubenswrapper[7508]: I0313 10:39:55.960301 7508 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Mar 13 10:39:56.045584 master-0 kubenswrapper[7508]: I0313 10:39:56.045492 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:39:56.045913 master-0 kubenswrapper[7508]: I0313 10:39:56.045584 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:39:56.045913 master-0 kubenswrapper[7508]: I0313 10:39:56.045613 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 10:39:56.045913 master-0 kubenswrapper[7508]: I0313 10:39:56.045666 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 10:39:57.396199 master-0 kubenswrapper[7508]: I0313 10:39:57.395987 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager 
namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:39:57.397330 master-0 kubenswrapper[7508]: I0313 10:39:57.396291 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:39:57.397330 master-0 kubenswrapper[7508]: I0313 10:39:57.396280 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:39:57.397330 master-0 kubenswrapper[7508]: I0313 10:39:57.396405 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:39:58.095307 master-0 kubenswrapper[7508]: I0313 10:39:58.095240 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_c834b554-c652-4f45-9110-3d4e260ba98a/installer/0.log" Mar 13 10:39:58.095619 master-0 kubenswrapper[7508]: I0313 10:39:58.095304 7508 generic.go:334] "Generic (PLEG): container finished" podID="c834b554-c652-4f45-9110-3d4e260ba98a" containerID="bda6d571a69475cffe984e819a7cc51ddb710348cfb7bd2636c19986e3e1d5ca" exitCode=1 Mar 13 10:39:58.096977 master-0 kubenswrapper[7508]: I0313 10:39:58.096948 7508 generic.go:334] "Generic 
(PLEG): container finished" podID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerID="34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104" exitCode=0 Mar 13 10:40:00.701146 master-0 kubenswrapper[7508]: E0313 10:40:00.701035 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 13 10:40:01.557703 master-0 kubenswrapper[7508]: I0313 10:40:01.557609 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:40:01.558204 master-0 kubenswrapper[7508]: I0313 10:40:01.557713 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:40:04.569305 master-0 kubenswrapper[7508]: I0313 10:40:04.569185 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:40:04.569305 master-0 kubenswrapper[7508]: I0313 10:40:04.569268 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 
10:40:05.478787 master-0 kubenswrapper[7508]: E0313 10:40:05.478547 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 10:40:05.960047 master-0 kubenswrapper[7508]: I0313 10:40:05.959925 7508 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:40:05.961139 master-0 kubenswrapper[7508]: I0313 10:40:05.960074 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Mar 13 10:40:06.046033 master-0 kubenswrapper[7508]: I0313 10:40:06.045911 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:40:06.046033 master-0 kubenswrapper[7508]: I0313 10:40:06.046009 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 10:40:06.743736 master-0 kubenswrapper[7508]: I0313 10:40:06.743622 
7508 status_manager.go:851] "Failed to get status for pod" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" pod="openshift-kube-apiserver/installer-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-master-0)" Mar 13 10:40:07.396333 master-0 kubenswrapper[7508]: I0313 10:40:07.396211 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:40:07.396333 master-0 kubenswrapper[7508]: I0313 10:40:07.396280 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:40:07.396333 master-0 kubenswrapper[7508]: I0313 10:40:07.396321 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:40:07.397018 master-0 kubenswrapper[7508]: I0313 10:40:07.396357 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:40:11.557809 master-0 kubenswrapper[7508]: I0313 10:40:11.557727 7508 patch_prober.go:28] interesting 
pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:40:11.559075 master-0 kubenswrapper[7508]: I0313 10:40:11.557820 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:40:14.561843 master-0 kubenswrapper[7508]: I0313 10:40:14.561736 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:40:14.562815 master-0 kubenswrapper[7508]: I0313 10:40:14.561863 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:40:15.067748 master-0 kubenswrapper[7508]: E0313 10:40:15.067659 7508 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06ecac2e_bffa_474b_a824_9ba4a194159a.slice/crio-conmon-77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490.scope\": RecentStats: unable to find data in memory cache]" Mar 13 10:40:15.123781 master-0 kubenswrapper[7508]: E0313 10:40:15.123546 7508 kubelet_node_status.go:585] "Error 
updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:40:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:40:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:40:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:40:05Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6
e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8c978bb5c329452b181f61f00452b4c2bfd83d245db56050bc7607972a791a76\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:e6567accc084db971e077b5ca666357e3a326fa27f69fc7135a5bc2e19f998eb\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221745369},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd7
7b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp
-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeByt
es\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e9ee63a30a9b95b5801afa36e09fc583ec2cda3c5cb3c8676e478fea016abfa1\\\"],\\\"sizeBytes\\\":470680779},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8b
c43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:15.229761 master-0 kubenswrapper[7508]: I0313 10:40:15.229562 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-d5flg_06ecac2e-bffa-474b-a824-9ba4a194159a/control-plane-machine-set-operator/0.log" Mar 13 10:40:15.229761 master-0 kubenswrapper[7508]: I0313 10:40:15.229639 7508 generic.go:334] "Generic (PLEG): container finished" podID="06ecac2e-bffa-474b-a824-9ba4a194159a" containerID="77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490" exitCode=1 Mar 13 10:40:15.960361 master-0 kubenswrapper[7508]: I0313 10:40:15.960267 7508 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Liveness probe status=failure output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:40:15.960925 master-0 kubenswrapper[7508]: I0313 10:40:15.960357 7508 patch_prober.go:28] interesting pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:40:15.960925 master-0 kubenswrapper[7508]: I0313 10:40:15.960387 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/healthz\": dial tcp 
10.128.0.35:8081: connect: connection refused" Mar 13 10:40:15.960925 master-0 kubenswrapper[7508]: I0313 10:40:15.960453 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Mar 13 10:40:16.045242 master-0 kubenswrapper[7508]: I0313 10:40:16.045086 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:40:16.045529 master-0 kubenswrapper[7508]: I0313 10:40:16.045253 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 10:40:16.045529 master-0 kubenswrapper[7508]: I0313 10:40:16.045082 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Liveness probe status=failure output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:40:16.045529 master-0 kubenswrapper[7508]: I0313 10:40:16.045401 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/healthz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 
10:40:16.243249 master-0 kubenswrapper[7508]: I0313 10:40:16.243084 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-955fcfb87-pbxm8_1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c/machine-approver-controller/0.log" Mar 13 10:40:16.243566 master-0 kubenswrapper[7508]: I0313 10:40:16.243530 7508 generic.go:334] "Generic (PLEG): container finished" podID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerID="7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9" exitCode=255 Mar 13 10:40:17.396514 master-0 kubenswrapper[7508]: I0313 10:40:17.396283 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:40:17.396514 master-0 kubenswrapper[7508]: I0313 10:40:17.396360 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Liveness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:40:17.396514 master-0 kubenswrapper[7508]: I0313 10:40:17.396394 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:40:17.396514 master-0 kubenswrapper[7508]: I0313 10:40:17.396439 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" 
probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:40:20.276898 master-0 kubenswrapper[7508]: I0313 10:40:20.276820 7508 generic.go:334] "Generic (PLEG): container finished" podID="193b3b95-f9a3-4272-853b-86366ce348a2" containerID="ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac" exitCode=0 Mar 13 10:40:21.558144 master-0 kubenswrapper[7508]: I0313 10:40:21.558028 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" start-of-body= Mar 13 10:40:21.559036 master-0 kubenswrapper[7508]: I0313 10:40:21.558219 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": dial tcp 10.128.0.68:5443: connect: connection refused" Mar 13 10:40:21.706782 master-0 kubenswrapper[7508]: E0313 10:40:21.706697 7508 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0" Mar 13 10:40:21.707033 master-0 kubenswrapper[7508]: E0313 10:40:21.707008 7508 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.012s" Mar 13 10:40:21.707127 master-0 kubenswrapper[7508]: I0313 10:40:21.707052 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerStarted","Data":"292bc64fae325e305791874ac3c6df238e90679ca812b4e7ab3bdd42cad6e68f"} Mar 13 
10:40:21.707311 master-0 kubenswrapper[7508]: I0313 10:40:21.707280 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:40:21.707311 master-0 kubenswrapper[7508]: I0313 10:40:21.707310 7508 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:40:21.709227 master-0 kubenswrapper[7508]: I0313 10:40:21.709187 7508 scope.go:117] "RemoveContainer" containerID="c39379a7ceff230ca12a3c25b2f95b4de4ef093f144e78b137c6626ee9d2fcfb" Mar 13 10:40:21.716628 master-0 kubenswrapper[7508]: I0313 10:40:21.716569 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 10:40:22.293866 master-0 kubenswrapper[7508]: I0313 10:40:22.293775 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-85b658d7fb-45fq6_97328e01-1227-417e-9af7-6426495d96db/packageserver/0.log" Mar 13 10:40:22.480270 master-0 kubenswrapper[7508]: E0313 10:40:22.479896 7508 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s" Mar 13 10:40:22.766461 master-0 kubenswrapper[7508]: E0313 10:40:22.766191 7508 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{redhat-marketplace-dnhzw.189c6050c5f17e9c openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:redhat-marketplace-dnhzw,UID:f99b999c-4213-4d29-ab14-26c584e88445,APIVersion:v1,ResourceVersion:9531,FieldPath:spec.initContainers{extract-utilities},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:37:33.311721116 +0000 UTC m=+92.054546233,LastTimestamp:2026-03-13 10:37:33.311721116 +0000 UTC m=+92.054546233,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:40:24.561410 master-0 kubenswrapper[7508]: I0313 10:40:24.561346 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:40:24.562562 master-0 kubenswrapper[7508]: I0313 10:40:24.561413 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" Mar 13 10:40:25.124072 master-0 kubenswrapper[7508]: E0313 10:40:25.123982 7508 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:25.959547 master-0 kubenswrapper[7508]: I0313 10:40:25.959452 7508 patch_prober.go:28] interesting 
pod/operator-controller-controller-manager-6598bfb6c4-22jb5 container/manager namespace/openshift-operator-controller: Readiness probe status=failure output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" start-of-body= Mar 13 10:40:25.959547 master-0 kubenswrapper[7508]: I0313 10:40:25.959537 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" podUID="ec33c506-8abe-4659-84d3-a294c31b446c" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.35:8081/readyz\": dial tcp 10.128.0.35:8081: connect: connection refused" Mar 13 10:40:26.045460 master-0 kubenswrapper[7508]: I0313 10:40:26.045346 7508 patch_prober.go:28] interesting pod/catalogd-controller-manager-7f8b8b6f4c-657wt container/manager namespace/openshift-catalogd: Readiness probe status=failure output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" start-of-body= Mar 13 10:40:26.045460 master-0 kubenswrapper[7508]: I0313 10:40:26.045445 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" podUID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerName="manager" probeResult="failure" output="Get \"http://10.128.0.36:8081/readyz\": dial tcp 10.128.0.36:8081: connect: connection refused" Mar 13 10:40:27.396048 master-0 kubenswrapper[7508]: I0313 10:40:27.395913 7508 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:40:27.396965 master-0 kubenswrapper[7508]: I0313 10:40:27.396063 7508 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:40:32.241573 master-0 kubenswrapper[7508]: E0313 10:40:32.241397 7508 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="10.534s" Mar 13 10:40:32.242689 master-0 kubenswrapper[7508]: I0313 10:40:32.241842 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerStarted","Data":"eaaaf7cee366d112a249fdfea1e9161302183d264a0f34f87ad0c3717abfbc0b"} Mar 13 10:40:32.243305 master-0 kubenswrapper[7508]: I0313 10:40:32.243227 7508 scope.go:117] "RemoveContainer" containerID="34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104" Mar 13 10:40:32.244303 master-0 kubenswrapper[7508]: I0313 10:40:32.244238 7508 scope.go:117] "RemoveContainer" containerID="ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac" Mar 13 10:40:32.244472 master-0 kubenswrapper[7508]: I0313 10:40:32.244328 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:40:32.244608 master-0 kubenswrapper[7508]: I0313 10:40:32.244550 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:40:32.244608 master-0 kubenswrapper[7508]: I0313 10:40:32.244601 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:40:32.244849 master-0 kubenswrapper[7508]: I0313 10:40:32.244612 7508 scope.go:117] "RemoveContainer" 
containerID="eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946" Mar 13 10:40:32.244849 master-0 kubenswrapper[7508]: I0313 10:40:32.244623 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerStarted","Data":"f2f35061d66ce08b758ee386196ed6ff6b4759bf3ce064d800ee6dab38937e10"} Mar 13 10:40:32.244849 master-0 kubenswrapper[7508]: I0313 10:40:32.244823 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerStarted","Data":"37af611a75465718656693a5e1606817c7f3876bc4578fbedfae2376aafb266a"} Mar 13 10:40:32.246407 master-0 kubenswrapper[7508]: I0313 10:40:32.244944 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:40:32.246407 master-0 kubenswrapper[7508]: I0313 10:40:32.244970 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"ad75c939343bfb30bc5319b14b8035776ee4b1b3343e77f1374907643eae75c7"} Mar 13 10:40:32.246407 master-0 kubenswrapper[7508]: I0313 10:40:32.244990 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerDied","Data":"d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248"} Mar 13 10:40:32.246407 master-0 kubenswrapper[7508]: I0313 10:40:32.245511 7508 scope.go:117] "RemoveContainer" containerID="84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c" Mar 13 10:40:32.246407 master-0 kubenswrapper[7508]: I0313 10:40:32.245791 7508 scope.go:117] "RemoveContainer" 
containerID="7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9" Mar 13 10:40:32.247791 master-0 kubenswrapper[7508]: I0313 10:40:32.246745 7508 scope.go:117] "RemoveContainer" containerID="1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258" Mar 13 10:40:32.273179 master-0 kubenswrapper[7508]: I0313 10:40:32.270981 7508 scope.go:117] "RemoveContainer" containerID="acffbeb48d69148ddc4c8917c5bd669fe4ed2976ba6b612592b2abc4fff01c7e" Mar 13 10:40:32.288817 master-0 kubenswrapper[7508]: I0313 10:40:32.288754 7508 scope.go:117] "RemoveContainer" containerID="6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea" Mar 13 10:40:32.292166 master-0 kubenswrapper[7508]: I0313 10:40:32.290835 7508 scope.go:117] "RemoveContainer" containerID="a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f" Mar 13 10:40:32.297140 master-0 kubenswrapper[7508]: I0313 10:40:32.293972 7508 scope.go:117] "RemoveContainer" containerID="77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490" Mar 13 10:40:32.300161 master-0 kubenswrapper[7508]: I0313 10:40:32.299511 7508 scope.go:117] "RemoveContainer" containerID="65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e" Mar 13 10:40:32.300930 master-0 kubenswrapper[7508]: I0313 10:40:32.300817 7508 scope.go:117] "RemoveContainer" containerID="d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829" Mar 13 10:40:32.307149 master-0 kubenswrapper[7508]: I0313 10:40:32.303766 7508 scope.go:117] "RemoveContainer" containerID="7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe" Mar 13 10:40:32.307149 master-0 kubenswrapper[7508]: I0313 10:40:32.304780 7508 scope.go:117] "RemoveContainer" containerID="d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248" Mar 13 10:40:32.307149 master-0 kubenswrapper[7508]: I0313 10:40:32.305019 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:40:32.307149 master-0 kubenswrapper[7508]: I0313 10:40:32.305049 7508 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:40:32.307149 master-0 kubenswrapper[7508]: I0313 10:40:32.305064 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9"} Mar 13 10:40:32.307149 master-0 kubenswrapper[7508]: I0313 10:40:32.305085 7508 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:40:32.312145 master-0 kubenswrapper[7508]: I0313 10:40:32.309491 7508 scope.go:117] "RemoveContainer" containerID="1619a1ce8609d442d9975720a8d6d707786b968509ed048f691e33fc7d117748" Mar 13 10:40:32.332162 master-0 kubenswrapper[7508]: I0313 10:40:32.330184 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351612 7508 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351649 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351661 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351684 7508 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" 
mirrorPodUID="14334617-f367-40b1-852b-8e59a9f689f3" Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351703 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351712 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerDied","Data":"1619a1ce8609d442d9975720a8d6d707786b968509ed048f691e33fc7d117748"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351727 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"1e86a3b0-37b3-4df1-a522-f29cda076753","Type":"ContainerDied","Data":"d19b978c1e8101a0212df3b6611d9d31aa1e8b34d80df670a9b5c7dd94abdbf2"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351743 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351752 7508 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="14334617-f367-40b1-852b-8e59a9f689f3" Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351770 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerDied","Data":"c39379a7ceff230ca12a3c25b2f95b4de4ef093f144e78b137c6626ee9d2fcfb"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351782 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" 
event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerDied","Data":"d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351794 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerDied","Data":"acffbeb48d69148ddc4c8917c5bd669fe4ed2976ba6b612592b2abc4fff01c7e"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351805 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351825 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerStarted","Data":"023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351835 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerDied","Data":"1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351853 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerDied","Data":"6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351863 7508 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerDied","Data":"a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351874 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerDied","Data":"84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351884 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerDied","Data":"eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351894 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerDied","Data":"65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351903 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerStarted","Data":"215427705b781af8c7a6f0bf3e652f4e47b2031bd0151f282db01d4307872ee6"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351913 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" 
event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerStarted","Data":"6ba9e9be1786a23e8c36df67db33e0578535dc45660f08e0b4a15c0971863075"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351930 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c834b554-c652-4f45-9110-3d4e260ba98a","Type":"ContainerDied","Data":"bda6d571a69475cffe984e819a7cc51ddb710348cfb7bd2636c19986e3e1d5ca"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351941 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerDied","Data":"34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351956 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351970 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351981 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.351993 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.352001 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.352009 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerDied","Data":"77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.352018 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerDied","Data":"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.352030 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerDied","Data":"ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac"} Mar 13 10:40:32.352176 master-0 kubenswrapper[7508]: I0313 10:40:32.352051 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerStarted","Data":"f8ee70c9fa0ff679b2fe8d381e882bf591af1eea80403c4097b9987e2d06b36d"} Mar 13 10:40:32.356136 master-0 kubenswrapper[7508]: I0313 10:40:32.354016 7508 scope.go:117] "RemoveContainer" 
containerID="476341e9a176df7914ed42068e9cb3e621e16d05240f26c7f1a1bd7339384984" Mar 13 10:40:32.356136 master-0 kubenswrapper[7508]: I0313 10:40:32.354451 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:40:32.387145 master-0 kubenswrapper[7508]: I0313 10:40:32.385441 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" podStartSLOduration=192.385407689 podStartE2EDuration="3m12.385407689s" podCreationTimestamp="2026-03-13 10:37:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:40:32.384692708 +0000 UTC m=+271.127517825" watchObservedRunningTime="2026-03-13 10:40:32.385407689 +0000 UTC m=+271.128232806" Mar 13 10:40:32.534554 master-0 kubenswrapper[7508]: I0313 10:40:32.534518 7508 scope.go:117] "RemoveContainer" containerID="c06a4f7f54577d80872f3a5157b329f2c2ec17e43e599b09564a82e127162989" Mar 13 10:40:32.752275 master-0 kubenswrapper[7508]: I0313 10:40:32.752208 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" podStartSLOduration=160.978446066 podStartE2EDuration="3m19.752181667s" podCreationTimestamp="2026-03-13 10:37:13 +0000 UTC" firstStartedPulling="2026-03-13 10:37:15.574558586 +0000 UTC m=+74.317383693" lastFinishedPulling="2026-03-13 10:37:54.348294177 +0000 UTC m=+113.091119294" observedRunningTime="2026-03-13 10:40:32.750970001 +0000 UTC m=+271.493795128" watchObservedRunningTime="2026-03-13 10:40:32.752181667 +0000 UTC m=+271.495006784" Mar 13 10:40:32.845595 master-0 kubenswrapper[7508]: I0313 10:40:32.839784 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-kwwkz" 
podStartSLOduration=174.094192107 podStartE2EDuration="3m12.839757378s" podCreationTimestamp="2026-03-13 10:37:20 +0000 UTC" firstStartedPulling="2026-03-13 10:37:56.85004978 +0000 UTC m=+115.592874887" lastFinishedPulling="2026-03-13 10:38:15.595615041 +0000 UTC m=+134.338440158" observedRunningTime="2026-03-13 10:40:32.839464699 +0000 UTC m=+271.582289816" watchObservedRunningTime="2026-03-13 10:40:32.839757378 +0000 UTC m=+271.582582495" Mar 13 10:40:32.989342 master-0 kubenswrapper[7508]: I0313 10:40:32.989298 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_1e86a3b0-37b3-4df1-a522-f29cda076753/installer/0.log" Mar 13 10:40:32.989790 master-0 kubenswrapper[7508]: I0313 10:40:32.989393 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:40:32.989790 master-0 kubenswrapper[7508]: I0313 10:40:32.989768 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_c834b554-c652-4f45-9110-3d4e260ba98a/installer/0.log" Mar 13 10:40:32.989855 master-0 kubenswrapper[7508]: I0313 10:40:32.989813 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:40:33.004981 master-0 kubenswrapper[7508]: I0313 10:40:33.004263 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-29dk6"] Mar 13 10:40:33.031491 master-0 kubenswrapper[7508]: I0313 10:40:33.031439 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-29dk6"] Mar 13 10:40:33.066871 master-0 kubenswrapper[7508]: I0313 10:40:33.066793 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-dbhll"] Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.078854 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-var-lock\") pod \"1e86a3b0-37b3-4df1-a522-f29cda076753\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.078943 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-kubelet-dir\") pod \"1e86a3b0-37b3-4df1-a522-f29cda076753\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.078973 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-kubelet-dir\") pod \"c834b554-c652-4f45-9110-3d4e260ba98a\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.078996 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86a3b0-37b3-4df1-a522-f29cda076753-kube-api-access\") pod 
\"1e86a3b0-37b3-4df1-a522-f29cda076753\" (UID: \"1e86a3b0-37b3-4df1-a522-f29cda076753\") " Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.079040 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c834b554-c652-4f45-9110-3d4e260ba98a-kube-api-access\") pod \"c834b554-c652-4f45-9110-3d4e260ba98a\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.079058 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-var-lock\") pod \"c834b554-c652-4f45-9110-3d4e260ba98a\" (UID: \"c834b554-c652-4f45-9110-3d4e260ba98a\") " Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.079305 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-var-lock" (OuterVolumeSpecName: "var-lock") pod "c834b554-c652-4f45-9110-3d4e260ba98a" (UID: "c834b554-c652-4f45-9110-3d4e260ba98a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.079343 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-var-lock" (OuterVolumeSpecName: "var-lock") pod "1e86a3b0-37b3-4df1-a522-f29cda076753" (UID: "1e86a3b0-37b3-4df1-a522-f29cda076753"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.079359 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "1e86a3b0-37b3-4df1-a522-f29cda076753" (UID: "1e86a3b0-37b3-4df1-a522-f29cda076753"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:40:33.080205 master-0 kubenswrapper[7508]: I0313 10:40:33.079373 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c834b554-c652-4f45-9110-3d4e260ba98a" (UID: "c834b554-c652-4f45-9110-3d4e260ba98a"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:40:33.083871 master-0 kubenswrapper[7508]: I0313 10:40:33.078101 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-dbhll"] Mar 13 10:40:33.113211 master-0 kubenswrapper[7508]: I0313 10:40:33.111024 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1e86a3b0-37b3-4df1-a522-f29cda076753-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1e86a3b0-37b3-4df1-a522-f29cda076753" (UID: "1e86a3b0-37b3-4df1-a522-f29cda076753"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:40:33.113211 master-0 kubenswrapper[7508]: I0313 10:40:33.111046 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c834b554-c652-4f45-9110-3d4e260ba98a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c834b554-c652-4f45-9110-3d4e260ba98a" (UID: "c834b554-c652-4f45-9110-3d4e260ba98a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:40:33.172393 master-0 kubenswrapper[7508]: I0313 10:40:33.172275 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" podStartSLOduration=173.713109646 podStartE2EDuration="3m13.17223769s" podCreationTimestamp="2026-03-13 10:37:20 +0000 UTC" firstStartedPulling="2026-03-13 10:37:55.98751727 +0000 UTC m=+114.730342387" lastFinishedPulling="2026-03-13 10:38:15.446645314 +0000 UTC m=+134.189470431" observedRunningTime="2026-03-13 10:40:33.163299398 +0000 UTC m=+271.906124515" watchObservedRunningTime="2026-03-13 10:40:33.17223769 +0000 UTC m=+271.915062817" Mar 13 10:40:33.180813 master-0 kubenswrapper[7508]: I0313 10:40:33.180755 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:40:33.180813 master-0 kubenswrapper[7508]: I0313 10:40:33.180793 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1e86a3b0-37b3-4df1-a522-f29cda076753-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:40:33.180813 master-0 kubenswrapper[7508]: I0313 10:40:33.180802 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c834b554-c652-4f45-9110-3d4e260ba98a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:40:33.180813 master-0 kubenswrapper[7508]: I0313 10:40:33.180810 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c834b554-c652-4f45-9110-3d4e260ba98a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:40:33.180813 master-0 kubenswrapper[7508]: I0313 10:40:33.180821 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:40:33.180813 master-0 kubenswrapper[7508]: I0313 10:40:33.180830 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1e86a3b0-37b3-4df1-a522-f29cda076753-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:40:33.238199 master-0 kubenswrapper[7508]: I0313 10:40:33.238134 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbmz8"] Mar 13 10:40:33.251138 master-0 kubenswrapper[7508]: I0313 10:40:33.245273 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-cbmz8"] Mar 13 10:40:33.278214 master-0 kubenswrapper[7508]: I0313 10:40:33.276095 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-wlpwf"] Mar 13 10:40:33.297124 master-0 kubenswrapper[7508]: I0313 10:40:33.290327 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-wlpwf"] Mar 13 10:40:33.322151 master-0 kubenswrapper[7508]: I0313 10:40:33.314193 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" podStartSLOduration=164.84906775 podStartE2EDuration="3m14.314174408s" podCreationTimestamp="2026-03-13 10:37:19 +0000 UTC" firstStartedPulling="2026-03-13 10:37:24.893275311 +0000 UTC m=+83.636100428" lastFinishedPulling="2026-03-13 10:37:54.358381969 +0000 UTC m=+113.101207086" observedRunningTime="2026-03-13 10:40:33.31253937 +0000 UTC m=+272.055364487" watchObservedRunningTime="2026-03-13 10:40:33.314174408 +0000 UTC m=+272.056999525" Mar 13 10:40:33.357138 master-0 kubenswrapper[7508]: I0313 10:40:33.353236 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:40:33.357138 master-0 kubenswrapper[7508]: I0313 10:40:33.353318 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:33.361131 master-0 kubenswrapper[7508]: I0313 10:40:33.358087 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-kqrsd" podStartSLOduration=152.535946885 podStartE2EDuration="3m14.358065246s" podCreationTimestamp="2026-03-13 10:37:19 +0000 UTC" firstStartedPulling="2026-03-13 10:37:33.651887838 +0000 UTC m=+92.394712955" lastFinishedPulling="2026-03-13 10:38:15.474006199 +0000 UTC m=+134.216831316" observedRunningTime="2026-03-13 10:40:33.336448672 +0000 UTC m=+272.079273789" watchObservedRunningTime="2026-03-13 10:40:33.358065246 +0000 UTC m=+272.100890363" Mar 13 10:40:33.361131 master-0 kubenswrapper[7508]: I0313 10:40:33.358811 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podStartSLOduration=192.358805998 podStartE2EDuration="3m12.358805998s" podCreationTimestamp="2026-03-13 10:37:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:40:33.355210573 +0000 UTC m=+272.098035690" watchObservedRunningTime="2026-03-13 10:40:33.358805998 +0000 UTC m=+272.101631115" Mar 13 10:40:33.378138 master-0 
kubenswrapper[7508]: I0313 10:40:33.377784 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-657wt_db9faadf-74e9-4a7f-b3a6-902dd14ac978/manager/0.log" Mar 13 10:40:33.378138 master-0 kubenswrapper[7508]: I0313 10:40:33.378141 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"c14288f5668e235056cc67c66c8553579053cff3b8159a0ec2c339bf75712609"} Mar 13 10:40:33.382135 master-0 kubenswrapper[7508]: I0313 10:40:33.378848 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:40:33.382135 master-0 kubenswrapper[7508]: I0313 10:40:33.380417 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_c834b554-c652-4f45-9110-3d4e260ba98a/installer/0.log" Mar 13 10:40:33.382135 master-0 kubenswrapper[7508]: I0313 10:40:33.380469 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c834b554-c652-4f45-9110-3d4e260ba98a","Type":"ContainerDied","Data":"14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495"} Mar 13 10:40:33.382135 master-0 kubenswrapper[7508]: I0313 10:40:33.380495 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495" Mar 13 10:40:33.382135 master-0 kubenswrapper[7508]: I0313 10:40:33.380540 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:40:33.383037 master-0 kubenswrapper[7508]: I0313 10:40:33.382723 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-hkjrg_a3c91eef-ec46-419f-b418-ac3a8094b77d/approver/0.log" Mar 13 10:40:33.383125 master-0 kubenswrapper[7508]: I0313 10:40:33.383053 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"4306aa93623283fa1e756de36acf9fe639a1c8b92b5741ac2b1dc315689b3cc6"} Mar 13 10:40:33.384510 master-0 kubenswrapper[7508]: I0313 10:40:33.384484 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerStarted","Data":"7f44cac9d59c9752582d0c710ae74baa24a3adcc9cd398ea6e5fd9c8a59527e5"} Mar 13 10:40:33.385137 master-0 kubenswrapper[7508]: I0313 10:40:33.385098 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:40:33.390367 master-0 kubenswrapper[7508]: I0313 10:40:33.390327 7508 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-4v99n container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: connection refused" start-of-body= Mar 13 10:40:33.390476 master-0 kubenswrapper[7508]: I0313 10:40:33.390368 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" podUID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.20:8080/healthz\": dial tcp 10.128.0.20:8080: connect: 
connection refused" Mar 13 10:40:33.400540 master-0 kubenswrapper[7508]: I0313 10:40:33.399801 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-z9wrg_8d2fdba3-9478-4165-9207-d01483625607/network-operator/1.log" Mar 13 10:40:33.400540 master-0 kubenswrapper[7508]: I0313 10:40:33.399869 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"f92b7dcf30e2a83f947525493e88745aa9417da1536fbf60b66ed4a133a0e4a5"} Mar 13 10:40:33.405925 master-0 kubenswrapper[7508]: I0313 10:40:33.405887 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-b2ss8_cf740515-d70d-44b6-ac00-21143b5494d1/ingress-operator/0.log" Mar 13 10:40:33.406004 master-0 kubenswrapper[7508]: I0313 10:40:33.405937 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"443c931f2ebac98a3b89766ad47f2b9a07d8226240bc2a88a99655cd8cc10093"} Mar 13 10:40:33.408402 master-0 kubenswrapper[7508]: I0313 10:40:33.408376 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"b87c048ad8f6b66600aef035430a3c74694d425a7990645314c96636905e37f6"} Mar 13 10:40:33.410714 master-0 kubenswrapper[7508]: I0313 10:40:33.410670 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/0.log" Mar 13 10:40:33.410774 master-0 kubenswrapper[7508]: I0313 10:40:33.410735 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"df8dbee9c77b0ca318382f012c8a23d7d342a4f43e0448369274b4a7e9be8d82"} Mar 13 10:40:33.421529 master-0 kubenswrapper[7508]: I0313 10:40:33.421453 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerStarted","Data":"6bccc03f527d31faff90a6a48a17616821689e95166564e5cd0c7e71c9851946"} Mar 13 10:40:33.423093 master-0 kubenswrapper[7508]: I0313 10:40:33.423060 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/0.log" Mar 13 10:40:33.423157 master-0 kubenswrapper[7508]: I0313 10:40:33.423124 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"1a464f39fb3c28eac1b441005b20c015c22b43034a9004d421103f0a297535d2"} Mar 13 10:40:33.425486 master-0 kubenswrapper[7508]: I0313 10:40:33.424898 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"cbd147b01b260c41122b60c0c59b0fada043d48bb6658bed62fc58e0949c3b69"} Mar 13 10:40:33.426230 master-0 kubenswrapper[7508]: I0313 10:40:33.426204 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_1e86a3b0-37b3-4df1-a522-f29cda076753/installer/0.log" Mar 13 10:40:33.426280 master-0 kubenswrapper[7508]: I0313 10:40:33.426255 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" 
event={"ID":"1e86a3b0-37b3-4df1-a522-f29cda076753","Type":"ContainerDied","Data":"cc178eff65e9e37dfca64d7638a02200669b20cdded82a2b29fd98ec8a15cc9e"} Mar 13 10:40:33.426280 master-0 kubenswrapper[7508]: I0313 10:40:33.426270 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc178eff65e9e37dfca64d7638a02200669b20cdded82a2b29fd98ec8a15cc9e" Mar 13 10:40:33.426371 master-0 kubenswrapper[7508]: I0313 10:40:33.426310 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:40:33.431593 master-0 kubenswrapper[7508]: I0313 10:40:33.431566 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-955fcfb87-pbxm8_1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c/machine-approver-controller/0.log" Mar 13 10:40:33.432585 master-0 kubenswrapper[7508]: I0313 10:40:33.432254 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerStarted","Data":"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"} Mar 13 10:40:33.440572 master-0 kubenswrapper[7508]: I0313 10:40:33.440339 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerStarted","Data":"54da7b3235abaa116243b07a8cea7e97784d45d4d84871349e58b575ce64f621"} Mar 13 10:40:33.440572 master-0 kubenswrapper[7508]: I0313 10:40:33.440416 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:40:33.447269 master-0 kubenswrapper[7508]: I0313 10:40:33.446165 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:40:33.447269 master-0 kubenswrapper[7508]: I0313 10:40:33.446419 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-d5flg_06ecac2e-bffa-474b-a824-9ba4a194159a/control-plane-machine-set-operator/0.log" Mar 13 10:40:33.447269 master-0 kubenswrapper[7508]: I0313 10:40:33.446470 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerStarted","Data":"406d6e11697cacd57dcd99d84785c736a52ac48c6ef5c27b81e728ae6e2f38f1"} Mar 13 10:40:33.450324 master-0 kubenswrapper[7508]: I0313 10:40:33.449700 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-22jb5_ec33c506-8abe-4659-84d3-a294c31b446c/manager/0.log" Mar 13 10:40:33.450324 master-0 kubenswrapper[7508]: I0313 10:40:33.449753 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"b6607de7f8444878291cce041e89b284e3fdfa07de1c40770b98ee1612cc8d65"} Mar 13 10:40:33.450324 master-0 kubenswrapper[7508]: I0313 10:40:33.450289 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:40:33.452301 master-0 kubenswrapper[7508]: I0313 10:40:33.452015 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-wz9t2_b57f1c19-f44a-4405-8135-79aef1d1ce07/cluster-storage-operator/0.log" Mar 13 10:40:33.453519 master-0 kubenswrapper[7508]: I0313 10:40:33.452936 7508 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerStarted","Data":"8d2502ddf45dc60246cfc038c25340d355c40feb7ef15264d33e1c93664efbd3"} Mar 13 10:40:33.462313 master-0 kubenswrapper[7508]: I0313 10:40:33.462206 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" podStartSLOduration=194.349403888 podStartE2EDuration="3m17.462140602s" podCreationTimestamp="2026-03-13 10:37:16 +0000 UTC" firstStartedPulling="2026-03-13 10:37:56.101496637 +0000 UTC m=+114.844321754" lastFinishedPulling="2026-03-13 10:37:59.214233351 +0000 UTC m=+117.957058468" observedRunningTime="2026-03-13 10:40:33.458556547 +0000 UTC m=+272.201381664" watchObservedRunningTime="2026-03-13 10:40:33.462140602 +0000 UTC m=+272.204965719" Mar 13 10:40:33.477870 master-0 kubenswrapper[7508]: I0313 10:40:33.477786 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-lhqzl" podStartSLOduration=172.893351137 podStartE2EDuration="3m11.477763201s" podCreationTimestamp="2026-03-13 10:37:22 +0000 UTC" firstStartedPulling="2026-03-13 10:37:57.101141935 +0000 UTC m=+115.843967052" lastFinishedPulling="2026-03-13 10:38:15.685553989 +0000 UTC m=+134.428379116" observedRunningTime="2026-03-13 10:40:33.47568223 +0000 UTC m=+272.218507357" watchObservedRunningTime="2026-03-13 10:40:33.477763201 +0000 UTC m=+272.220588318" Mar 13 10:40:33.535135 master-0 kubenswrapper[7508]: I0313 10:40:33.534966 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b072636-e46b-47f6-af85-3210e62bbd2d" path="/var/lib/kubelet/pods/1b072636-e46b-47f6-af85-3210e62bbd2d/volumes" Mar 13 10:40:33.542181 master-0 kubenswrapper[7508]: I0313 10:40:33.542101 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" path="/var/lib/kubelet/pods/1b57fa2d-b65e-4c69-97ce-4a379470d2de/volumes" Mar 13 10:40:33.542951 master-0 kubenswrapper[7508]: I0313 10:40:33.542921 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" path="/var/lib/kubelet/pods/20df9416-90f4-4c21-a3bc-c6e5f6622e15/volumes" Mar 13 10:40:33.543651 master-0 kubenswrapper[7508]: I0313 10:40:33.543602 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" path="/var/lib/kubelet/pods/61b83fd7-2b78-42a9-9d93-0be3fd59a679/volumes" Mar 13 10:40:33.646366 master-0 kubenswrapper[7508]: I0313 10:40:33.643907 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" podStartSLOduration=168.47931439 podStartE2EDuration="3m14.643890939s" podCreationTimestamp="2026-03-13 10:37:19 +0000 UTC" firstStartedPulling="2026-03-13 10:37:29.571734582 +0000 UTC m=+88.314559689" lastFinishedPulling="2026-03-13 10:37:55.736311121 +0000 UTC m=+114.479136238" observedRunningTime="2026-03-13 10:40:33.643058814 +0000 UTC m=+272.385883931" watchObservedRunningTime="2026-03-13 10:40:33.643890939 +0000 UTC m=+272.386716056" Mar 13 10:40:33.771026 master-0 kubenswrapper[7508]: I0313 10:40:33.770932 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 10:40:33.776415 master-0 kubenswrapper[7508]: I0313 10:40:33.776363 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 13 10:40:33.792225 master-0 kubenswrapper[7508]: I0313 10:40:33.791218 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-dnhzw" podStartSLOduration=173.281363026 podStartE2EDuration="3m11.791200184s" podCreationTimestamp="2026-03-13 10:37:22 
+0000 UTC" firstStartedPulling="2026-03-13 10:37:56.974712689 +0000 UTC m=+115.717537806" lastFinishedPulling="2026-03-13 10:38:15.484549807 +0000 UTC m=+134.227374964" observedRunningTime="2026-03-13 10:40:33.790863954 +0000 UTC m=+272.533689081" watchObservedRunningTime="2026-03-13 10:40:33.791200184 +0000 UTC m=+272.534025301" Mar 13 10:40:33.991460 master-0 kubenswrapper[7508]: I0313 10:40:33.991310 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" podStartSLOduration=168.391329258 podStartE2EDuration="3m14.991275498s" podCreationTimestamp="2026-03-13 10:37:19 +0000 UTC" firstStartedPulling="2026-03-13 10:37:29.145734681 +0000 UTC m=+87.888559838" lastFinishedPulling="2026-03-13 10:37:55.745680941 +0000 UTC m=+114.488506078" observedRunningTime="2026-03-13 10:40:33.990565378 +0000 UTC m=+272.733390575" watchObservedRunningTime="2026-03-13 10:40:33.991275498 +0000 UTC m=+272.734100645" Mar 13 10:40:34.050916 master-0 kubenswrapper[7508]: I0313 10:40:34.050527 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" podStartSLOduration=166.500839988 podStartE2EDuration="3m15.050498337s" podCreationTimestamp="2026-03-13 10:37:19 +0000 UTC" firstStartedPulling="2026-03-13 10:37:27.18533079 +0000 UTC m=+85.928155897" lastFinishedPulling="2026-03-13 10:37:55.734989129 +0000 UTC m=+114.477814246" observedRunningTime="2026-03-13 10:40:34.046031726 +0000 UTC m=+272.788856853" watchObservedRunningTime="2026-03-13 10:40:34.050498337 +0000 UTC m=+272.793323494" Mar 13 10:40:34.183244 master-0 kubenswrapper[7508]: I0313 10:40:34.183076 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" podStartSLOduration=166.950404531 podStartE2EDuration="3m18.183045199s" podCreationTimestamp="2026-03-13 10:37:16 +0000 
UTC" firstStartedPulling="2026-03-13 10:37:22.510663689 +0000 UTC m=+81.253488806" lastFinishedPulling="2026-03-13 10:37:53.743304357 +0000 UTC m=+112.486129474" observedRunningTime="2026-03-13 10:40:34.177385373 +0000 UTC m=+272.920210510" watchObservedRunningTime="2026-03-13 10:40:34.183045199 +0000 UTC m=+272.925870356" Mar 13 10:40:34.354049 master-0 kubenswrapper[7508]: I0313 10:40:34.353797 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:40:34.354962 master-0 kubenswrapper[7508]: I0313 10:40:34.353932 7508 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:34.468078 master-0 kubenswrapper[7508]: I0313 10:40:34.467876 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:40:35.508850 master-0 kubenswrapper[7508]: I0313 10:40:35.508758 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="046ee36d-4062-4c48-bab0-57381613b2ad" path="/var/lib/kubelet/pods/046ee36d-4062-4c48-bab0-57381613b2ad/volumes" Mar 13 10:40:35.806870 master-0 kubenswrapper[7508]: I0313 10:40:35.806636 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 13 10:40:35.806870 master-0 kubenswrapper[7508]: I0313 10:40:35.806740 7508 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 13 10:40:35.840052 master-0 kubenswrapper[7508]: I0313 10:40:35.839844 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 13 10:40:36.304831 master-0 kubenswrapper[7508]: I0313 10:40:36.304682 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:40:36.500429 master-0 kubenswrapper[7508]: I0313 10:40:36.500350 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 13 10:40:36.501795 master-0 kubenswrapper[7508]: E0313 10:40:36.501671 7508 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 13 10:40:36.553899 master-0 kubenswrapper[7508]: I0313 10:40:36.553707 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=0.553672284 podStartE2EDuration="553.672284ms" podCreationTimestamp="2026-03-13 10:40:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:40:36.549419399 +0000 UTC m=+275.292244606" watchObservedRunningTime="2026-03-13 10:40:36.553672284 +0000 UTC m=+275.296497441" Mar 13 10:40:38.844926 master-0 kubenswrapper[7508]: I0313 10:40:38.844698 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:40:40.718067 master-0 kubenswrapper[7508]: I0313 10:40:40.718000 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:40:40.718795 master-0 kubenswrapper[7508]: I0313 10:40:40.718776 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:40:40.757719 
master-0 kubenswrapper[7508]: I0313 10:40:40.757659 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:40:41.604638 master-0 kubenswrapper[7508]: I0313 10:40:41.604572 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:40:41.998581 master-0 kubenswrapper[7508]: I0313 10:40:41.998448 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:40:42.558044 master-0 kubenswrapper[7508]: I0313 10:40:42.557927 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:40:42.558400 master-0 kubenswrapper[7508]: I0313 10:40:42.558055 7508 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:42.558400 master-0 kubenswrapper[7508]: I0313 10:40:42.558083 7508 patch_prober.go:28] interesting pod/packageserver-85b658d7fb-45fq6 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:40:42.558400 master-0 kubenswrapper[7508]: I0313 10:40:42.558225 7508 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" podUID="97328e01-1227-417e-9af7-6426495d96db" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.68:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:44.998634 master-0 kubenswrapper[7508]: I0313 10:40:44.998538 7508 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:40:45.961241 master-0 kubenswrapper[7508]: I0313 10:40:45.961145 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:40:46.048521 master-0 kubenswrapper[7508]: I0313 10:40:46.048378 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.027806 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028209 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028252 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028280 7508 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="994d29a3-98d8-45bd-8922-adcdc899b632" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028287 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="994d29a3-98d8-45bd-8922-adcdc899b632" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028298 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="046ee36d-4062-4c48-bab0-57381613b2ad" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028306 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="046ee36d-4062-4c48-bab0-57381613b2ad" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028318 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028325 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028336 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028343 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028353 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028360 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 
kubenswrapper[7508]: E0313 10:40:48.028376 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028384 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028392 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028400 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028409 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028418 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028429 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028435 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028444 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028451 7508 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" containerName="extract-utilities" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028461 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b488263-6a56-439c-945e-926936ed049d" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028468 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b488263-6a56-439c-945e-926936ed049d" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: E0313 10:40:48.028477 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028485 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028631 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="61b83fd7-2b78-42a9-9d93-0be3fd59a679" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028653 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028670 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="994d29a3-98d8-45bd-8922-adcdc899b632" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028679 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b072636-e46b-47f6-af85-3210e62bbd2d" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028698 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" containerName="installer" Mar 13 10:40:48.030310 master-0 
kubenswrapper[7508]: I0313 10:40:48.028722 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b57fa2d-b65e-4c69-97ce-4a379470d2de" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028740 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="20df9416-90f4-4c21-a3bc-c6e5f6622e15" containerName="extract-content" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028752 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b488263-6a56-439c-945e-926936ed049d" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.028763 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="046ee36d-4062-4c48-bab0-57381613b2ad" containerName="installer" Mar 13 10:40:48.030310 master-0 kubenswrapper[7508]: I0313 10:40:48.029377 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 13 10:40:48.038712 master-0 kubenswrapper[7508]: I0313 10:40:48.032684 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-z9gwk" Mar 13 10:40:48.038712 master-0 kubenswrapper[7508]: I0313 10:40:48.033019 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 13 10:40:48.047545 master-0 kubenswrapper[7508]: I0313 10:40:48.047479 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 13 10:40:48.145344 master-0 kubenswrapper[7508]: I0313 10:40:48.145261 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " 
pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.145693 master-0 kubenswrapper[7508]: I0313 10:40:48.145398 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.145693 master-0 kubenswrapper[7508]: I0313 10:40:48.145466 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.247130 master-0 kubenswrapper[7508]: I0313 10:40:48.247023 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.247395 master-0 kubenswrapper[7508]: I0313 10:40:48.247290 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.247452 master-0 kubenswrapper[7508]: I0313 10:40:48.247306 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.247576 master-0 kubenswrapper[7508]: I0313 10:40:48.247527 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.247699 master-0 kubenswrapper[7508]: I0313 10:40:48.247626 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.277375 master-0 kubenswrapper[7508]: I0313 10:40:48.277283 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.375596 master-0 kubenswrapper[7508]: I0313 10:40:48.375489 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0"
Mar 13 10:40:48.886029 master-0 kubenswrapper[7508]: I0313 10:40:48.885946 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"]
Mar 13 10:40:48.896974 master-0 kubenswrapper[7508]: W0313 10:40:48.896904 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda7c07c6e_447f_4111_9d5a_b848fc3e1b2b.slice/crio-1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406 WatchSource:0}: Error finding container 1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406: Status 404 returned error can't find the container with id 1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406
Mar 13 10:40:49.034868 master-0 kubenswrapper[7508]: I0313 10:40:49.034798 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"]
Mar 13 10:40:49.036368 master-0 kubenswrapper[7508]: I0313 10:40:49.035840 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.038069 master-0 kubenswrapper[7508]: I0313 10:40:49.038011 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hnsk9"
Mar 13 10:40:49.038460 master-0 kubenswrapper[7508]: I0313 10:40:49.038313 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 13 10:40:49.047858 master-0 kubenswrapper[7508]: I0313 10:40:49.047800 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"]
Mar 13 10:40:49.084989 master-0 kubenswrapper[7508]: I0313 10:40:49.084932 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.085280 master-0 kubenswrapper[7508]: I0313 10:40:49.085004 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.085280 master-0 kubenswrapper[7508]: I0313 10:40:49.085236 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.186176 master-0 kubenswrapper[7508]: I0313 10:40:49.186073 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.186610 master-0 kubenswrapper[7508]: I0313 10:40:49.186559 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.186741 master-0 kubenswrapper[7508]: I0313 10:40:49.186597 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.186920 master-0 kubenswrapper[7508]: I0313 10:40:49.186637 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.187153 master-0 kubenswrapper[7508]: I0313 10:40:49.187138 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.206023 master-0 kubenswrapper[7508]: I0313 10:40:49.205921 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.381817 master-0 kubenswrapper[7508]: I0313 10:40:49.381760 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:40:49.632879 master-0 kubenswrapper[7508]: I0313 10:40:49.632807 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b","Type":"ContainerStarted","Data":"00a4f5e044b3bb37309a0058cc340985271f0a9be303d372e70635d4947090aa"}
Mar 13 10:40:49.632879 master-0 kubenswrapper[7508]: I0313 10:40:49.632877 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b","Type":"ContainerStarted","Data":"1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406"}
Mar 13 10:40:49.654143 master-0 kubenswrapper[7508]: I0313 10:40:49.653981 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=1.6539445179999999 podStartE2EDuration="1.653944518s" podCreationTimestamp="2026-03-13 10:40:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:40:49.649731084 +0000 UTC m=+288.392556201" watchObservedRunningTime="2026-03-13 10:40:49.653944518 +0000 UTC m=+288.396769645"
Mar 13 10:40:49.879041 master-0 kubenswrapper[7508]: I0313 10:40:49.878944 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-retry-1-master-0"]
Mar 13 10:40:50.643857 master-0 kubenswrapper[7508]: I0313 10:40:50.643651 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"3b44838d-cfe0-42fe-9927-d0b5391eee81","Type":"ContainerStarted","Data":"4f57dbde7e6dd83a3f45d28b694622a3cd36e451a3d2e531b974cdf91eee3a45"}
Mar 13 10:40:50.643857 master-0 kubenswrapper[7508]: I0313 10:40:50.643830 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"3b44838d-cfe0-42fe-9927-d0b5391eee81","Type":"ContainerStarted","Data":"d4e74163544c10bf31d045c60068db268de2c869878f5f7b983afe24046cf63d"}
Mar 13 10:40:51.566636 master-0 kubenswrapper[7508]: I0313 10:40:51.566574 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"
Mar 13 10:40:51.590608 master-0 kubenswrapper[7508]: I0313 10:40:51.590519 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" podStartSLOduration=2.590494067 podStartE2EDuration="2.590494067s" podCreationTimestamp="2026-03-13 10:40:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:40:50.675576074 +0000 UTC m=+289.418401221" watchObservedRunningTime="2026-03-13 10:40:51.590494067 +0000 UTC m=+290.333319194"
Mar 13 10:40:51.650613 master-0 kubenswrapper[7508]: I0313 10:40:51.650564 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-z7h4j_53da2840-4a92-497a-a9d3-973583887147/kube-controller-manager-operator/1.log"
Mar 13 10:40:51.651275 master-0 kubenswrapper[7508]: I0313 10:40:51.650934 7508 generic.go:334] "Generic (PLEG): container finished" podID="53da2840-4a92-497a-a9d3-973583887147" containerID="023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493" exitCode=255
Mar 13 10:40:51.651275 master-0 kubenswrapper[7508]: I0313 10:40:51.651000 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerDied","Data":"023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493"}
Mar 13 10:40:51.651275 master-0 kubenswrapper[7508]: I0313 10:40:51.651128 7508 scope.go:117] "RemoveContainer" containerID="c24269090669e540d849b1a7ede32ee9641b8d7335ec065d4a9e4c4317788e00"
Mar 13 10:40:51.651492 master-0 kubenswrapper[7508]: I0313 10:40:51.651469 7508 scope.go:117] "RemoveContainer" containerID="023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493"
Mar 13 10:40:51.651706 master-0 kubenswrapper[7508]: E0313 10:40:51.651677 7508 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-controller-manager-operator pod=kube-controller-manager-operator-86d7cdfdfb-z7h4j_openshift-kube-controller-manager-operator(53da2840-4a92-497a-a9d3-973583887147)\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" podUID="53da2840-4a92-497a-a9d3-973583887147"
Mar 13 10:40:52.008416 master-0 kubenswrapper[7508]: I0313 10:40:52.008342 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:40:52.014218 master-0 kubenswrapper[7508]: I0313 10:40:52.014164 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:40:52.661992 master-0 kubenswrapper[7508]: I0313 10:40:52.661909 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-z7h4j_53da2840-4a92-497a-a9d3-973583887147/kube-controller-manager-operator/1.log"
Mar 13 10:41:01.936035 master-0 kubenswrapper[7508]: I0313 10:41:01.935966 7508 scope.go:117] "RemoveContainer" containerID="34b7e36b0204fb75f5eaa9ffadb1e13d0888ef1773ea6fc2201df90d0a2dcd5e"
Mar 13 10:41:04.500653 master-0 kubenswrapper[7508]: I0313 10:41:04.500572 7508 scope.go:117] "RemoveContainer" containerID="023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493"
Mar 13 10:41:04.848115 master-0 kubenswrapper[7508]: I0313 10:41:04.848047 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-z7h4j_53da2840-4a92-497a-a9d3-973583887147/kube-controller-manager-operator/1.log"
Mar 13 10:41:04.848345 master-0 kubenswrapper[7508]: I0313 10:41:04.848155 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerStarted","Data":"ffbff762b6947c8a6cf71150184bbac8a221faecb2335c23291939c8a280ac89"}
Mar 13 10:41:09.778904 master-0 kubenswrapper[7508]: I0313 10:41:09.775345 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8"]
Mar 13 10:41:09.778904 master-0 kubenswrapper[7508]: I0313 10:41:09.775420 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"]
Mar 13 10:41:09.778904 master-0 kubenswrapper[7508]: I0313 10:41:09.775739 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="cluster-cloud-controller-manager" containerID="cri-o://cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309" gracePeriod=30
Mar 13 10:41:09.778904 master-0 kubenswrapper[7508]: I0313 10:41:09.776021 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="kube-rbac-proxy" containerID="cri-o://39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba" gracePeriod=30
Mar 13 10:41:09.781834 master-0 kubenswrapper[7508]: I0313 10:41:09.781732 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="config-sync-controllers" containerID="cri-o://68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030" gracePeriod=30
Mar 13 10:41:09.781936 master-0 kubenswrapper[7508]: I0313 10:41:09.781878 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" containerID="cri-o://04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0" gracePeriod=30
Mar 13 10:41:09.782012 master-0 kubenswrapper[7508]: I0313 10:41:09.781980 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" containerID="cri-o://767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" gracePeriod=30
Mar 13 10:41:09.810308 master-0 kubenswrapper[7508]: I0313 10:41:09.809827 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-j9twr"]
Mar 13 10:41:09.810912 master-0 kubenswrapper[7508]: I0313 10:41:09.810816 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.813169 master-0 kubenswrapper[7508]: I0313 10:41:09.813081 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-24kvc"
Mar 13 10:41:09.830584 master-0 kubenswrapper[7508]: I0313 10:41:09.817522 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 10:41:09.847186 master-0 kubenswrapper[7508]: I0313 10:41:09.835733 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-6lqz5"]
Mar 13 10:41:09.847186 master-0 kubenswrapper[7508]: I0313 10:41:09.836799 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:09.852304 master-0 kubenswrapper[7508]: I0313 10:41:09.849273 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-tvfvf"
Mar 13 10:41:09.872203 master-0 kubenswrapper[7508]: I0313 10:41:09.866569 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-6lqz5"]
Mar 13 10:41:09.896882 master-0 kubenswrapper[7508]: I0313 10:41:09.896800 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.896882 master-0 kubenswrapper[7508]: I0313 10:41:09.896881 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m5x2b\" (UniqueName: \"kubernetes.io/projected/0529b217-a9ef-48fb-b40a-b6789c640c20-kube-api-access-m5x2b\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.897213 master-0 kubenswrapper[7508]: I0313 10:41:09.896934 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:09.897213 master-0 kubenswrapper[7508]: I0313 10:41:09.896991 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.897213 master-0 kubenswrapper[7508]: I0313 10:41:09.897021 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr5lp\" (UniqueName: \"kubernetes.io/projected/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-kube-api-access-qr5lp\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:09.897213 master-0 kubenswrapper[7508]: I0313 10:41:09.897052 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0529b217-a9ef-48fb-b40a-b6789c640c20-rootfs\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.999745 master-0 kubenswrapper[7508]: I0313 10:41:09.999659 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:09.999986 master-0 kubenswrapper[7508]: I0313 10:41:09.999756 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.999986 master-0 kubenswrapper[7508]: I0313 10:41:09.999780 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5lp\" (UniqueName: \"kubernetes.io/projected/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-kube-api-access-qr5lp\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:09.999986 master-0 kubenswrapper[7508]: I0313 10:41:09.999802 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0529b217-a9ef-48fb-b40a-b6789c640c20-rootfs\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.999986 master-0 kubenswrapper[7508]: I0313 10:41:09.999836 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:09.999986 master-0 kubenswrapper[7508]: I0313 10:41:09.999856 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5x2b\" (UniqueName: \"kubernetes.io/projected/0529b217-a9ef-48fb-b40a-b6789c640c20-kube-api-access-m5x2b\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:10.006272 master-0 kubenswrapper[7508]: I0313 10:41:10.006225 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:10.006469 master-0 kubenswrapper[7508]: I0313 10:41:10.006282 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0529b217-a9ef-48fb-b40a-b6789c640c20-rootfs\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:10.013242 master-0 kubenswrapper[7508]: I0313 10:41:10.013188 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:10.017626 master-0 kubenswrapper[7508]: I0313 10:41:10.016604 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:10.029980 master-0 kubenswrapper[7508]: I0313 10:41:10.029847 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5x2b\" (UniqueName: \"kubernetes.io/projected/0529b217-a9ef-48fb-b40a-b6789c640c20-kube-api-access-m5x2b\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:10.035898 master-0 kubenswrapper[7508]: I0313 10:41:10.035827 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:10.035898 master-0 kubenswrapper[7508]: I0313 10:41:10.035896 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr5lp\" (UniqueName: \"kubernetes.io/projected/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-kube-api-access-qr5lp\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:10.060944 master-0 kubenswrapper[7508]: I0313 10:41:10.060875 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:10.078821 master-0 kubenswrapper[7508]: I0313 10:41:10.078777 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-955fcfb87-pbxm8_1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c/machine-approver-controller/0.log"
Mar 13 10:41:10.079450 master-0 kubenswrapper[7508]: I0313 10:41:10.079423 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8"
Mar 13 10:41:10.080888 master-0 kubenswrapper[7508]: I0313 10:41:10.080858 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-vmt9j_d6fbad53-304a-4338-974e-d9974921c48f/kube-rbac-proxy/0.log"
Mar 13 10:41:10.081650 master-0 kubenswrapper[7508]: I0313 10:41:10.081630 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106226 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6fbad53-304a-4338-974e-d9974921c48f-host-etc-kube\") pod \"d6fbad53-304a-4338-974e-d9974921c48f\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106294 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fbad53-304a-4338-974e-d9974921c48f-cloud-controller-manager-operator-tls\") pod \"d6fbad53-304a-4338-974e-d9974921c48f\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106314 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pb688\" (UniqueName: \"kubernetes.io/projected/d6fbad53-304a-4338-974e-d9974921c48f-kube-api-access-pb688\") pod \"d6fbad53-304a-4338-974e-d9974921c48f\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106356 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-auth-proxy-config\") pod \"d6fbad53-304a-4338-974e-d9974921c48f\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106398 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-config\") pod \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106417 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-machine-approver-tls\") pod \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106439 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-images\") pod \"d6fbad53-304a-4338-974e-d9974921c48f\" (UID: \"d6fbad53-304a-4338-974e-d9974921c48f\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106464 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zc2r2\" (UniqueName: \"kubernetes.io/projected/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-kube-api-access-zc2r2\") pod \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") "
Mar 13 10:41:10.106623 master-0 kubenswrapper[7508]: I0313 10:41:10.106495 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-auth-proxy-config\") pod \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\" (UID: \"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c\") "
Mar 13 10:41:10.108781 master-0 kubenswrapper[7508]: I0313 10:41:10.108738 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d6fbad53-304a-4338-974e-d9974921c48f-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "d6fbad53-304a-4338-974e-d9974921c48f" (UID: "d6fbad53-304a-4338-974e-d9974921c48f"). InnerVolumeSpecName "host-etc-kube". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:41:10.113547 master-0 kubenswrapper[7508]: I0313 10:41:10.110801 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" (UID: "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:41:10.113547 master-0 kubenswrapper[7508]: I0313 10:41:10.111871 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "d6fbad53-304a-4338-974e-d9974921c48f" (UID: "d6fbad53-304a-4338-974e-d9974921c48f"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:41:10.113547 master-0 kubenswrapper[7508]: I0313 10:41:10.112078 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-kube-api-access-zc2r2" (OuterVolumeSpecName: "kube-api-access-zc2r2") pod "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" (UID: "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c"). InnerVolumeSpecName "kube-api-access-zc2r2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:41:10.113547 master-0 kubenswrapper[7508]: I0313 10:41:10.112545 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-config" (OuterVolumeSpecName: "config") pod "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" (UID: "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:41:10.113547 master-0 kubenswrapper[7508]: I0313 10:41:10.112725 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-images" (OuterVolumeSpecName: "images") pod "d6fbad53-304a-4338-974e-d9974921c48f" (UID: "d6fbad53-304a-4338-974e-d9974921c48f"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:41:10.114589 master-0 kubenswrapper[7508]: I0313 10:41:10.114532 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d6fbad53-304a-4338-974e-d9974921c48f-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "d6fbad53-304a-4338-974e-d9974921c48f" (UID: "d6fbad53-304a-4338-974e-d9974921c48f"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:41:10.114872 master-0 kubenswrapper[7508]: I0313 10:41:10.114797 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" (UID: "1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:41:10.116830 master-0 kubenswrapper[7508]: I0313 10:41:10.116777 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6fbad53-304a-4338-974e-d9974921c48f-kube-api-access-pb688" (OuterVolumeSpecName: "kube-api-access-pb688") pod "d6fbad53-304a-4338-974e-d9974921c48f" (UID: "d6fbad53-304a-4338-974e-d9974921c48f"). InnerVolumeSpecName "kube-api-access-pb688". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208445 7508 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6fbad53-304a-4338-974e-d9974921c48f-host-etc-kube\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208485 7508 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/d6fbad53-304a-4338-974e-d9974921c48f-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208499 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pb688\" (UniqueName: \"kubernetes.io/projected/d6fbad53-304a-4338-974e-d9974921c48f-kube-api-access-pb688\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208508 7508 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-auth-proxy-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208518 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208529 7508 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-machine-approver-tls\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208537 7508 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName:
\"kubernetes.io/configmap/d6fbad53-304a-4338-974e-d9974921c48f-images\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208545 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zc2r2\" (UniqueName: \"kubernetes.io/projected/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-kube-api-access-zc2r2\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:10.209640 master-0 kubenswrapper[7508]: I0313 10:41:10.208555 7508 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:10.518328 master-0 kubenswrapper[7508]: I0313 10:41:10.518195 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-6lqz5"] Mar 13 10:41:10.730814 master-0 kubenswrapper[7508]: I0313 10:41:10.730762 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-559568b945-vmt9j_d6fbad53-304a-4338-974e-d9974921c48f/kube-rbac-proxy/0.log" Mar 13 10:41:10.731688 master-0 kubenswrapper[7508]: I0313 10:41:10.731636 7508 generic.go:334] "Generic (PLEG): container finished" podID="d6fbad53-304a-4338-974e-d9974921c48f" containerID="767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" exitCode=0 Mar 13 10:41:10.731688 master-0 kubenswrapper[7508]: I0313 10:41:10.731688 7508 generic.go:334] "Generic (PLEG): container finished" podID="d6fbad53-304a-4338-974e-d9974921c48f" containerID="68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030" exitCode=0 Mar 13 10:41:10.731824 master-0 kubenswrapper[7508]: I0313 10:41:10.731698 7508 generic.go:334] "Generic (PLEG): container finished" podID="d6fbad53-304a-4338-974e-d9974921c48f" containerID="cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309" 
exitCode=0 Mar 13 10:41:10.731824 master-0 kubenswrapper[7508]: I0313 10:41:10.731701 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerDied","Data":"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c"} Mar 13 10:41:10.731824 master-0 kubenswrapper[7508]: I0313 10:41:10.731761 7508 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" Mar 13 10:41:10.731824 master-0 kubenswrapper[7508]: I0313 10:41:10.731798 7508 scope.go:117] "RemoveContainer" containerID="767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" Mar 13 10:41:10.732283 master-0 kubenswrapper[7508]: I0313 10:41:10.731779 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerDied","Data":"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030"} Mar 13 10:41:10.732283 master-0 kubenswrapper[7508]: I0313 10:41:10.732247 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerDied","Data":"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309"} Mar 13 10:41:10.732283 master-0 kubenswrapper[7508]: I0313 10:41:10.732275 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j" 
event={"ID":"d6fbad53-304a-4338-974e-d9974921c48f","Type":"ContainerDied","Data":"b399c8bc734d16f4c258d0605a39203e9489484fa48d09e79fa8aa138647119c"} Mar 13 10:41:10.746432 master-0 kubenswrapper[7508]: I0313 10:41:10.744435 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerStarted","Data":"77ae6dbbf39c4d2991c10b142e9d6fe23b3ada856897b7bc34aa3b7d69fa418b"} Mar 13 10:41:10.751870 master-0 kubenswrapper[7508]: I0313 10:41:10.751734 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" event={"ID":"0529b217-a9ef-48fb-b40a-b6789c640c20","Type":"ContainerStarted","Data":"1001e4a8a4042183edf1d1d087bc112421eacf94e38f3e35de4c5170d3dca5be"} Mar 13 10:41:10.751870 master-0 kubenswrapper[7508]: I0313 10:41:10.751789 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" event={"ID":"0529b217-a9ef-48fb-b40a-b6789c640c20","Type":"ContainerStarted","Data":"fd5397c516d4a5473893c96f67f300df25fb73d79280ce5bc95242d87f0224a1"} Mar 13 10:41:10.751870 master-0 kubenswrapper[7508]: I0313 10:41:10.751802 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" event={"ID":"0529b217-a9ef-48fb-b40a-b6789c640c20","Type":"ContainerStarted","Data":"9a9692d62aeb99fb7d4d3fc80637ffdf1ea3947790e26d640f42aacc16302c11"} Mar 13 10:41:10.756562 master-0 kubenswrapper[7508]: I0313 10:41:10.756384 7508 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-955fcfb87-pbxm8_1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c/machine-approver-controller/0.log" Mar 13 10:41:10.762241 master-0 kubenswrapper[7508]: I0313 10:41:10.761315 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" Mar 13 10:41:10.762437 master-0 kubenswrapper[7508]: I0313 10:41:10.762366 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerDied","Data":"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"} Mar 13 10:41:10.762658 master-0 kubenswrapper[7508]: I0313 10:41:10.762550 7508 generic.go:334] "Generic (PLEG): container finished" podID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerID="04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0" exitCode=0 Mar 13 10:41:10.762658 master-0 kubenswrapper[7508]: I0313 10:41:10.762581 7508 generic.go:334] "Generic (PLEG): container finished" podID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerID="39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba" exitCode=0 Mar 13 10:41:10.762658 master-0 kubenswrapper[7508]: I0313 10:41:10.762601 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerDied","Data":"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"} Mar 13 10:41:10.762886 master-0 kubenswrapper[7508]: I0313 10:41:10.762849 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8" event={"ID":"1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c","Type":"ContainerDied","Data":"31634e1fa2a526a5eef76adce598a8e242bdd09cd3c5df9b79281ebf5788e31f"} Mar 13 10:41:10.789015 master-0 kubenswrapper[7508]: I0313 10:41:10.783216 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" podStartSLOduration=2.783192232 podStartE2EDuration="2.783192232s" 
podCreationTimestamp="2026-03-13 10:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:10.780261976 +0000 UTC m=+309.523087123" watchObservedRunningTime="2026-03-13 10:41:10.783192232 +0000 UTC m=+309.526017339" Mar 13 10:41:10.789015 master-0 kubenswrapper[7508]: I0313 10:41:10.783703 7508 scope.go:117] "RemoveContainer" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" Mar 13 10:41:10.823361 master-0 kubenswrapper[7508]: I0313 10:41:10.821133 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"] Mar 13 10:41:10.830946 master-0 kubenswrapper[7508]: I0313 10:41:10.830876 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-vmt9j"] Mar 13 10:41:10.840139 master-0 kubenswrapper[7508]: I0313 10:41:10.839948 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8"] Mar 13 10:41:10.850213 master-0 kubenswrapper[7508]: I0313 10:41:10.850109 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-pbxm8"] Mar 13 10:41:10.857783 master-0 kubenswrapper[7508]: I0313 10:41:10.857745 7508 scope.go:117] "RemoveContainer" containerID="68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030" Mar 13 10:41:10.861600 master-0 kubenswrapper[7508]: I0313 10:41:10.861551 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"] Mar 13 10:41:10.861834 master-0 kubenswrapper[7508]: E0313 10:41:10.861813 7508 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: I0313 10:41:10.861836 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: E0313 10:41:10.861855 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="kube-rbac-proxy" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: I0313 10:41:10.861864 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="kube-rbac-proxy" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: E0313 10:41:10.861890 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="config-sync-controllers" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: I0313 10:41:10.861900 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="config-sync-controllers" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: E0313 10:41:10.861909 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" Mar 13 10:41:10.861915 master-0 kubenswrapper[7508]: I0313 10:41:10.861916 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" Mar 13 10:41:10.862280 master-0 kubenswrapper[7508]: E0313 10:41:10.861927 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="cluster-cloud-controller-manager" Mar 13 10:41:10.862280 master-0 kubenswrapper[7508]: I0313 10:41:10.861936 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fbad53-304a-4338-974e-d9974921c48f" 
containerName="cluster-cloud-controller-manager" Mar 13 10:41:10.862280 master-0 kubenswrapper[7508]: E0313 10:41:10.861947 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" Mar 13 10:41:10.862280 master-0 kubenswrapper[7508]: I0313 10:41:10.861954 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" Mar 13 10:41:10.862280 master-0 kubenswrapper[7508]: I0313 10:41:10.862075 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863378 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="kube-rbac-proxy" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863413 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863434 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" containerName="machine-approver-controller" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863445 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="config-sync-controllers" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863454 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="cluster-cloud-controller-manager" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: E0313 10:41:10.863592 7508 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863605 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.863746 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6fbad53-304a-4338-974e-d9974921c48f" containerName="kube-rbac-proxy" Mar 13 10:41:10.869237 master-0 kubenswrapper[7508]: I0313 10:41:10.868407 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:10.871632 master-0 kubenswrapper[7508]: I0313 10:41:10.871264 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 13 10:41:10.871632 master-0 kubenswrapper[7508]: I0313 10:41:10.871610 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:41:10.871776 master-0 kubenswrapper[7508]: I0313 10:41:10.871769 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 10:41:10.874889 master-0 kubenswrapper[7508]: I0313 10:41:10.874843 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:41:10.875119 master-0 kubenswrapper[7508]: I0313 10:41:10.874973 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-5pbvv" Mar 13 10:41:10.875119 master-0 kubenswrapper[7508]: I0313 10:41:10.875060 7508 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 10:41:10.898816 master-0 kubenswrapper[7508]: I0313 10:41:10.898182 7508 scope.go:117] "RemoveContainer" containerID="cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309" Mar 13 10:41:10.915322 master-0 kubenswrapper[7508]: I0313 10:41:10.915275 7508 scope.go:117] "RemoveContainer" containerID="767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" Mar 13 10:41:10.915960 master-0 kubenswrapper[7508]: E0313 10:41:10.915855 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": container with ID starting with 767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c not found: ID does not exist" containerID="767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" Mar 13 10:41:10.915960 master-0 kubenswrapper[7508]: I0313 10:41:10.915928 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c"} err="failed to get container status \"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": rpc error: code = NotFound desc = could not find container \"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": container with ID starting with 767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c not found: ID does not exist" Mar 13 10:41:10.916070 master-0 kubenswrapper[7508]: I0313 10:41:10.915967 7508 scope.go:117] "RemoveContainer" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" Mar 13 10:41:10.916445 master-0 kubenswrapper[7508]: E0313 10:41:10.916387 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": container with ID starting with 50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6 not found: ID does not exist" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" Mar 13 10:41:10.916507 master-0 kubenswrapper[7508]: I0313 10:41:10.916447 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6"} err="failed to get container status \"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": rpc error: code = NotFound desc = could not find container \"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": container with ID starting with 50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6 not found: ID does not exist" Mar 13 10:41:10.916507 master-0 kubenswrapper[7508]: I0313 10:41:10.916481 7508 scope.go:117] "RemoveContainer" containerID="68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030" Mar 13 10:41:10.916860 master-0 kubenswrapper[7508]: E0313 10:41:10.916824 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": container with ID starting with 68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030 not found: ID does not exist" containerID="68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030" Mar 13 10:41:10.916917 master-0 kubenswrapper[7508]: I0313 10:41:10.916863 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030"} err="failed to get container status \"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": rpc error: code = NotFound desc = could not find container 
\"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": container with ID starting with 68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030 not found: ID does not exist" Mar 13 10:41:10.916917 master-0 kubenswrapper[7508]: I0313 10:41:10.916888 7508 scope.go:117] "RemoveContainer" containerID="cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309" Mar 13 10:41:10.917227 master-0 kubenswrapper[7508]: E0313 10:41:10.917192 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": container with ID starting with cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309 not found: ID does not exist" containerID="cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309" Mar 13 10:41:10.917278 master-0 kubenswrapper[7508]: I0313 10:41:10.917227 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309"} err="failed to get container status \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": rpc error: code = NotFound desc = could not find container \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": container with ID starting with cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309 not found: ID does not exist" Mar 13 10:41:10.917278 master-0 kubenswrapper[7508]: I0313 10:41:10.917252 7508 scope.go:117] "RemoveContainer" containerID="767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" Mar 13 10:41:10.917552 master-0 kubenswrapper[7508]: I0313 10:41:10.917491 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c"} err="failed to get container status 
\"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": rpc error: code = NotFound desc = could not find container \"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": container with ID starting with 767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c not found: ID does not exist" Mar 13 10:41:10.917552 master-0 kubenswrapper[7508]: I0313 10:41:10.917543 7508 scope.go:117] "RemoveContainer" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" Mar 13 10:41:10.917886 master-0 kubenswrapper[7508]: I0313 10:41:10.917840 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6"} err="failed to get container status \"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": rpc error: code = NotFound desc = could not find container \"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": container with ID starting with 50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6 not found: ID does not exist" Mar 13 10:41:10.917886 master-0 kubenswrapper[7508]: I0313 10:41:10.917880 7508 scope.go:117] "RemoveContainer" containerID="68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030" Mar 13 10:41:10.918314 master-0 kubenswrapper[7508]: I0313 10:41:10.918272 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030"} err="failed to get container status \"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": rpc error: code = NotFound desc = could not find container \"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": container with ID starting with 68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030 not found: ID does not exist" Mar 13 10:41:10.918314 master-0 kubenswrapper[7508]: I0313 10:41:10.918309 7508 
scope.go:117] "RemoveContainer" containerID="cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309" Mar 13 10:41:10.918835 master-0 kubenswrapper[7508]: I0313 10:41:10.918599 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309"} err="failed to get container status \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": rpc error: code = NotFound desc = could not find container \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": container with ID starting with cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309 not found: ID does not exist" Mar 13 10:41:10.918835 master-0 kubenswrapper[7508]: I0313 10:41:10.918632 7508 scope.go:117] "RemoveContainer" containerID="767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c" Mar 13 10:41:10.919656 master-0 kubenswrapper[7508]: I0313 10:41:10.919018 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c"} err="failed to get container status \"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": rpc error: code = NotFound desc = could not find container \"767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c\": container with ID starting with 767bdaba5d3b495520417ba034e464a212cd3eb3bfd2a984e5ca8d3baf9ac72c not found: ID does not exist" Mar 13 10:41:10.919656 master-0 kubenswrapper[7508]: I0313 10:41:10.919037 7508 scope.go:117] "RemoveContainer" containerID="50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6" Mar 13 10:41:10.919656 master-0 kubenswrapper[7508]: I0313 10:41:10.919290 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6"} err="failed to get container status 
\"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": rpc error: code = NotFound desc = could not find container \"50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6\": container with ID starting with 50eec19ad0fc54417b94dac248db18ed527b476dadcd232285f0707578ddb5a6 not found: ID does not exist"
Mar 13 10:41:10.919656 master-0 kubenswrapper[7508]: I0313 10:41:10.919315 7508 scope.go:117] "RemoveContainer" containerID="68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030"
Mar 13 10:41:10.919656 master-0 kubenswrapper[7508]: I0313 10:41:10.919577 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030"} err="failed to get container status \"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": rpc error: code = NotFound desc = could not find container \"68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030\": container with ID starting with 68d03837ee58ba957be8d4cc140a354489b94bb60d9b9a50ac948712910ec030 not found: ID does not exist"
Mar 13 10:41:10.919656 master-0 kubenswrapper[7508]: I0313 10:41:10.919595 7508 scope.go:117] "RemoveContainer" containerID="cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309"
Mar 13 10:41:10.919876 master-0 kubenswrapper[7508]: I0313 10:41:10.919852 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309"} err="failed to get container status \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": rpc error: code = NotFound desc = could not find container \"cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309\": container with ID starting with cb918200cbb80069580658fac88bbfe04b504c2aa2cdddb73d929532c8694309 not found: ID does not exist"
Mar 13 10:41:10.919926 master-0 kubenswrapper[7508]: I0313 10:41:10.919877 7508 scope.go:117] "RemoveContainer" containerID="04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"
Mar 13 10:41:10.926062 master-0 kubenswrapper[7508]: I0313 10:41:10.924458 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9cxp\" (UniqueName: \"kubernetes.io/projected/b7090328-1191-4c7c-afed-603d7333014f-kube-api-access-v9cxp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:10.926062 master-0 kubenswrapper[7508]: I0313 10:41:10.924501 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:10.926062 master-0 kubenswrapper[7508]: I0313 10:41:10.924612 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:10.926062 master-0 kubenswrapper[7508]: I0313 10:41:10.924650 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:10.926062 master-0 kubenswrapper[7508]: I0313 10:41:10.924692 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b7090328-1191-4c7c-afed-603d7333014f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:10.943012 master-0 kubenswrapper[7508]: I0313 10:41:10.942979 7508 scope.go:117] "RemoveContainer" containerID="7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"
Mar 13 10:41:10.967475 master-0 kubenswrapper[7508]: I0313 10:41:10.967423 7508 scope.go:117] "RemoveContainer" containerID="39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"
Mar 13 10:41:10.967794 master-0 kubenswrapper[7508]: I0313 10:41:10.967760 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"]
Mar 13 10:41:10.968804 master-0 kubenswrapper[7508]: I0313 10:41:10.968775 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:10.970651 master-0 kubenswrapper[7508]: I0313 10:41:10.970617 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-ff7d6"
Mar 13 10:41:10.971270 master-0 kubenswrapper[7508]: I0313 10:41:10.971238 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 13 10:41:10.971457 master-0 kubenswrapper[7508]: I0313 10:41:10.971436 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 13 10:41:10.971658 master-0 kubenswrapper[7508]: I0313 10:41:10.971640 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 10:41:10.971831 master-0 kubenswrapper[7508]: I0313 10:41:10.971813 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 10:41:10.973968 master-0 kubenswrapper[7508]: I0313 10:41:10.973923 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 10:41:10.993859 master-0 kubenswrapper[7508]: I0313 10:41:10.993810 7508 scope.go:117] "RemoveContainer" containerID="04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"
Mar 13 10:41:10.995720 master-0 kubenswrapper[7508]: E0313 10:41:10.995341 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0\": container with ID starting with 04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0 not found: ID does not exist" containerID="04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"
Mar 13 10:41:10.995720 master-0 kubenswrapper[7508]: I0313 10:41:10.995389 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"} err="failed to get container status \"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0\": rpc error: code = NotFound desc = could not find container \"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0\": container with ID starting with 04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0 not found: ID does not exist"
Mar 13 10:41:10.995720 master-0 kubenswrapper[7508]: I0313 10:41:10.995418 7508 scope.go:117] "RemoveContainer" containerID="7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"
Mar 13 10:41:10.996402 master-0 kubenswrapper[7508]: E0313 10:41:10.996361 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9\": container with ID starting with 7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9 not found: ID does not exist" containerID="7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"
Mar 13 10:41:10.996402 master-0 kubenswrapper[7508]: I0313 10:41:10.996391 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"} err="failed to get container status \"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9\": rpc error: code = NotFound desc = could not find container \"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9\": container with ID starting with 7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9 not found: ID does not exist"
Mar 13 10:41:10.996402 master-0 kubenswrapper[7508]: I0313 10:41:10.996405 7508 scope.go:117] "RemoveContainer" containerID="39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"
Mar 13 10:41:10.996704 master-0 kubenswrapper[7508]: E0313 10:41:10.996665 7508 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba\": container with ID starting with 39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba not found: ID does not exist" containerID="39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"
Mar 13 10:41:10.996771 master-0 kubenswrapper[7508]: I0313 10:41:10.996701 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"} err="failed to get container status \"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba\": rpc error: code = NotFound desc = could not find container \"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba\": container with ID starting with 39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba not found: ID does not exist"
Mar 13 10:41:10.996771 master-0 kubenswrapper[7508]: I0313 10:41:10.996726 7508 scope.go:117] "RemoveContainer" containerID="04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"
Mar 13 10:41:10.997069 master-0 kubenswrapper[7508]: I0313 10:41:10.997004 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0"} err="failed to get container status \"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0\": rpc error: code = NotFound desc = could not find container \"04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0\": container with ID starting with 04e09fa8c186a876a9c2c7a0648ecaf9ee41e68a6536460d6cc13ec83264a3f0 not found: ID does not exist"
Mar 13 10:41:10.997140 master-0 kubenswrapper[7508]: I0313 10:41:10.997066 7508 scope.go:117] "RemoveContainer" containerID="7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"
Mar 13 10:41:10.997728 master-0 kubenswrapper[7508]: I0313 10:41:10.997670 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9"} err="failed to get container status \"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9\": rpc error: code = NotFound desc = could not find container \"7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9\": container with ID starting with 7f6525ff5128537603a69336d7c7fd9cf821137e892a9302babca040061d22c9 not found: ID does not exist"
Mar 13 10:41:10.997728 master-0 kubenswrapper[7508]: I0313 10:41:10.997707 7508 scope.go:117] "RemoveContainer" containerID="39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"
Mar 13 10:41:10.998752 master-0 kubenswrapper[7508]: I0313 10:41:10.998716 7508 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba"} err="failed to get container status \"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba\": rpc error: code = NotFound desc = could not find container \"39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba\": container with ID starting with 39197e197d762deba0d363bcc6896638ef2b38e22e6a8c772b577423fbaffeba not found: ID does not exist"
Mar 13 10:41:11.025891 master-0 kubenswrapper[7508]: I0313 10:41:11.025819 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.025891 master-0 kubenswrapper[7508]: I0313 10:41:11.025876 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9cxp\" (UniqueName: \"kubernetes.io/projected/b7090328-1191-4c7c-afed-603d7333014f-kube-api-access-v9cxp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.025891 master-0 kubenswrapper[7508]: I0313 10:41:11.025899 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.025920 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.026087 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmf6l\" (UniqueName: \"kubernetes.io/projected/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-kube-api-access-vmf6l\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.026211 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.026253 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.026642 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.026662 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b7090328-1191-4c7c-afed-603d7333014f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.026828 master-0 kubenswrapper[7508]: I0313 10:41:11.026816 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b7090328-1191-4c7c-afed-603d7333014f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.030421 master-0 kubenswrapper[7508]: I0313 10:41:11.027996 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.030421 master-0 kubenswrapper[7508]: I0313 10:41:11.027999 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.031575 master-0 kubenswrapper[7508]: I0313 10:41:11.031128 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.047894 master-0 kubenswrapper[7508]: I0313 10:41:11.046459 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9cxp\" (UniqueName: \"kubernetes.io/projected/b7090328-1191-4c7c-afed-603d7333014f-kube-api-access-v9cxp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.127948 master-0 kubenswrapper[7508]: I0313 10:41:11.127878 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.128187 master-0 kubenswrapper[7508]: I0313 10:41:11.128147 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.128451 master-0 kubenswrapper[7508]: I0313 10:41:11.128406 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmf6l\" (UniqueName: \"kubernetes.io/projected/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-kube-api-access-vmf6l\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.128543 master-0 kubenswrapper[7508]: I0313 10:41:11.128522 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.128782 master-0 kubenswrapper[7508]: I0313 10:41:11.128734 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.129299 master-0 kubenswrapper[7508]: I0313 10:41:11.129267 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.131784 master-0 kubenswrapper[7508]: I0313 10:41:11.131737 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.148346 master-0 kubenswrapper[7508]: I0313 10:41:11.148302 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmf6l\" (UniqueName: \"kubernetes.io/projected/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-kube-api-access-vmf6l\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.196349 master-0 kubenswrapper[7508]: I0313 10:41:11.196291 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:11.320705 master-0 kubenswrapper[7508]: I0313 10:41:11.320573 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:11.346826 master-0 kubenswrapper[7508]: W0313 10:41:11.346759 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8dc7af5f_ff72_4f06_88df_a26ff4c0bded.slice/crio-0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5 WatchSource:0}: Error finding container 0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5: Status 404 returned error can't find the container with id 0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5
Mar 13 10:41:11.511286 master-0 kubenswrapper[7508]: I0313 10:41:11.509597 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c" path="/var/lib/kubelet/pods/1891d6ce-1e9b-4a04-9ccd-4dfbf343f78c/volumes"
Mar 13 10:41:11.511286 master-0 kubenswrapper[7508]: I0313 10:41:11.510735 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6fbad53-304a-4338-974e-d9974921c48f" path="/var/lib/kubelet/pods/d6fbad53-304a-4338-974e-d9974921c48f/volumes"
Mar 13 10:41:11.788579 master-0 kubenswrapper[7508]: I0313 10:41:11.788511 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"48bcb8e3556650cfae3adfa0ba5f6b7611552bc7f1f0e3120408fbfc9691ca6f"}
Mar 13 10:41:11.788579 master-0 kubenswrapper[7508]: I0313 10:41:11.788577 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5"}
Mar 13 10:41:11.792875 master-0 kubenswrapper[7508]: I0313 10:41:11.792832 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"b449d051473ff9974acc080b10607f0bdeb8e4b0dbbbfc4c1bde4f8d09a30cfb"}
Mar 13 10:41:11.793287 master-0 kubenswrapper[7508]: I0313 10:41:11.792895 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"6b457fca38abf31ca20d44610b680f150e7060cd35d43f544ed341cc62e726d2"}
Mar 13 10:41:11.793287 master-0 kubenswrapper[7508]: I0313 10:41:11.792909 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"fe76c4da023ee8241529e5f2a6a092dc48a1a51d30db462a00bc458437ba96ee"}
Mar 13 10:41:11.800085 master-0 kubenswrapper[7508]: I0313 10:41:11.800057 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerStarted","Data":"8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f"}
Mar 13 10:41:11.800085 master-0 kubenswrapper[7508]: I0313 10:41:11.800113 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerStarted","Data":"2374456736ebc7d72463b6654e06d916657c29a267fba9a956c950f521d8de03"}
Mar 13 10:41:11.824050 master-0 kubenswrapper[7508]: I0313 10:41:11.823935 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" podStartSLOduration=2.82391186 podStartE2EDuration="2.82391186s" podCreationTimestamp="2026-03-13 10:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:11.821686595 +0000 UTC m=+310.564511722" watchObservedRunningTime="2026-03-13 10:41:11.82391186 +0000 UTC m=+310.566736977"
Mar 13 10:41:11.855383 master-0 kubenswrapper[7508]: I0313 10:41:11.854123 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-6gzxr"]
Mar 13 10:41:11.855383 master-0 kubenswrapper[7508]: I0313 10:41:11.854440 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="multus-admission-controller" containerID="cri-o://2276fd8efc0fde40f37ca319cd91132fc15d5529319ce35ac0901720d64c7ce3" gracePeriod=30
Mar 13 10:41:11.855383 master-0 kubenswrapper[7508]: I0313 10:41:11.855248 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="kube-rbac-proxy" containerID="cri-o://a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4" gracePeriod=30
Mar 13 10:41:12.820516 master-0 kubenswrapper[7508]: I0313 10:41:12.820447 7508 generic.go:334] "Generic (PLEG): container finished" podID="5da919b6-8545-4001-89f3-74cb289327f0" containerID="a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4" exitCode=0
Mar 13 10:41:12.821146 master-0 kubenswrapper[7508]: I0313 10:41:12.820525 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerDied","Data":"a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4"}
Mar 13 10:41:12.823393 master-0 kubenswrapper[7508]: I0313 10:41:12.823331 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"f9193ce0cecc29a04837d4cc5243527b46397232b9255d51f28db25efcba2a5f"}
Mar 13 10:41:12.825780 master-0 kubenswrapper[7508]: I0313 10:41:12.825732 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"12d7699651508d757bad08ce9a02fbaf1b9a7210ca40bc453b12412cce05999a"}
Mar 13 10:41:12.838482 master-0 kubenswrapper[7508]: I0313 10:41:12.838408 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" podStartSLOduration=2.838388138 podStartE2EDuration="2.838388138s" podCreationTimestamp="2026-03-13 10:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:12.837872573 +0000 UTC m=+311.580697690" watchObservedRunningTime="2026-03-13 10:41:12.838388138 +0000 UTC m=+311.581213255"
Mar 13 10:41:14.389081 master-0 kubenswrapper[7508]: I0313 10:41:14.388956 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" podStartSLOduration=4.388925166 podStartE2EDuration="4.388925166s" podCreationTimestamp="2026-03-13 10:41:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:12.86200948 +0000 UTC m=+311.604834637" watchObservedRunningTime="2026-03-13 10:41:14.388925166 +0000 UTC m=+313.131750293"
Mar 13 10:41:14.393829 master-0 kubenswrapper[7508]: I0313 10:41:14.393773 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"]
Mar 13 10:41:14.395218 master-0 kubenswrapper[7508]: I0313 10:41:14.395142 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.397157 master-0 kubenswrapper[7508]: I0313 10:41:14.397065 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-jpsrw"
Mar 13 10:41:14.397373 master-0 kubenswrapper[7508]: I0313 10:41:14.397314 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 10:41:14.409991 master-0 kubenswrapper[7508]: I0313 10:41:14.409942 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"]
Mar 13 10:41:14.572488 master-0 kubenswrapper[7508]: I0313 10:41:14.572420 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.572722 master-0 kubenswrapper[7508]: I0313 10:41:14.572521 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rqms\" (UniqueName: \"kubernetes.io/projected/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-kube-api-access-5rqms\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.572722 master-0 kubenswrapper[7508]: I0313 10:41:14.572570 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.674525 master-0 kubenswrapper[7508]: I0313 10:41:14.674176 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.674525 master-0 kubenswrapper[7508]: I0313 10:41:14.674285 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqms\" (UniqueName: \"kubernetes.io/projected/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-kube-api-access-5rqms\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.674525 master-0 kubenswrapper[7508]: I0313 10:41:14.674349 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.676325 master-0 kubenswrapper[7508]: I0313 10:41:14.676249 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.681012 master-0 kubenswrapper[7508]: I0313 10:41:14.680936 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.705501 master-0 kubenswrapper[7508]: I0313 10:41:14.705440 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqms\" (UniqueName: \"kubernetes.io/projected/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-kube-api-access-5rqms\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:14.723558 master-0 kubenswrapper[7508]: I0313 10:41:14.723424 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:15.145513 master-0 kubenswrapper[7508]: I0313 10:41:15.145436 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"]
Mar 13 10:41:15.147427 master-0 kubenswrapper[7508]: W0313 10:41:15.147355 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bbcde8d_4c56_4ef7_9fe5_f0ceebb1e65e.slice/crio-d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d WatchSource:0}: Error finding container d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d: Status 404 returned error can't find the container with id d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d
Mar 13 10:41:15.448949 master-0 kubenswrapper[7508]: I0313 10:41:15.448880 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp"]
Mar 13 10:41:15.449761 master-0 kubenswrapper[7508]: I0313 10:41:15.449731 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp"
Mar 13 10:41:15.452391 master-0 kubenswrapper[7508]: I0313 10:41:15.452339 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 10:41:15.453994 master-0 kubenswrapper[7508]: I0313 10:41:15.453951 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-mbkch"]
Mar 13 10:41:15.454928 master-0 kubenswrapper[7508]: I0313 10:41:15.454900 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch"
Mar 13 10:41:15.456710 master-0 kubenswrapper[7508]: I0313 10:41:15.456675 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv"]
Mar 13 10:41:15.457281 master-0 kubenswrapper[7508]: I0313 10:41:15.457261 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv"
Mar 13 10:41:15.458731 master-0 kubenswrapper[7508]: I0313 10:41:15.458685 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 13 10:41:15.459015 master-0 kubenswrapper[7508]: I0313 10:41:15.458973 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 10:41:15.459246 master-0 kubenswrapper[7508]: I0313 10:41:15.459213 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 10:41:15.460137 master-0 kubenswrapper[7508]: I0313 10:41:15.460055 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 13 10:41:15.460331 master-0 kubenswrapper[7508]: I0313 10:41:15.460286 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 10:41:15.464750 master-0 kubenswrapper[7508]: I0313 10:41:15.464703 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp"]
Mar 13 10:41:15.468677 master-0 kubenswrapper[7508]: I0313 10:41:15.468645 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 13 10:41:15.483621 master-0 kubenswrapper[7508]: I0313 10:41:15.483568 7508 kubelet.go:2421] "SyncLoop ADD" source="api"
pods=["openshift-multus/cni-sysctl-allowlist-ds-7z94w"] Mar 13 10:41:15.484298 master-0 kubenswrapper[7508]: I0313 10:41:15.484275 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.486965 master-0 kubenswrapper[7508]: I0313 10:41:15.486401 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-smz8l" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.488834 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv"] Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489085 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489142 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/258f571e-5ec8-42df-b4ba-17457d87d10d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-85vcp\" (UID: \"258f571e-5ec8-42df-b4ba-17457d87d10d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489161 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-metrics-certs\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.490114 
master-0 kubenswrapper[7508]: I0313 10:41:15.489176 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489191 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-service-ca-bundle\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489220 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489236 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8z5fj\" (UniqueName: \"kubernetes.io/projected/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-kube-api-access-8z5fj\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489255 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-default-certificate\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489283 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489302 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr9x5\" (UniqueName: \"kubernetes.io/projected/b460735c-56aa-4dd3-a756-759859083e12-kube-api-access-qr9x5\") pod \"network-check-source-7c67b67d47-zxjfv\" (UID: \"b460735c-56aa-4dd3-a756-759859083e12\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" Mar 13 10:41:15.490114 master-0 kubenswrapper[7508]: I0313 10:41:15.489320 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-stats-auth\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.490810 master-0 kubenswrapper[7508]: I0313 10:41:15.490791 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 13 10:41:15.590284 master-0 kubenswrapper[7508]: I0313 10:41:15.590224 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-metrics-certs\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.590284 master-0 kubenswrapper[7508]: I0313 10:41:15.590272 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-service-ca-bundle\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.590284 master-0 kubenswrapper[7508]: I0313 10:41:15.590295 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.590614 master-0 kubenswrapper[7508]: I0313 10:41:15.590426 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.590614 master-0 kubenswrapper[7508]: I0313 10:41:15.590446 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z5fj\" (UniqueName: \"kubernetes.io/projected/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-kube-api-access-8z5fj\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.590614 master-0 kubenswrapper[7508]: I0313 10:41:15.590470 7508 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-default-certificate\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.590614 master-0 kubenswrapper[7508]: I0313 10:41:15.590597 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.590731 master-0 kubenswrapper[7508]: I0313 10:41:15.590616 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr9x5\" (UniqueName: \"kubernetes.io/projected/b460735c-56aa-4dd3-a756-759859083e12-kube-api-access-qr9x5\") pod \"network-check-source-7c67b67d47-zxjfv\" (UID: \"b460735c-56aa-4dd3-a756-759859083e12\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" Mar 13 10:41:15.590731 master-0 kubenswrapper[7508]: I0313 10:41:15.590637 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-stats-auth\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.590731 master-0 kubenswrapper[7508]: I0313 10:41:15.590665 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.590731 master-0 kubenswrapper[7508]: I0313 10:41:15.590700 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/258f571e-5ec8-42df-b4ba-17457d87d10d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-85vcp\" (UID: \"258f571e-5ec8-42df-b4ba-17457d87d10d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:15.593659 master-0 kubenswrapper[7508]: I0313 10:41:15.593618 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-default-certificate\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.593736 master-0 kubenswrapper[7508]: I0313 10:41:15.593717 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.593863 master-0 kubenswrapper[7508]: I0313 10:41:15.593838 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/258f571e-5ec8-42df-b4ba-17457d87d10d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-85vcp\" (UID: \"258f571e-5ec8-42df-b4ba-17457d87d10d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:15.594182 master-0 kubenswrapper[7508]: I0313 10:41:15.594154 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: 
\"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.594376 master-0 kubenswrapper[7508]: I0313 10:41:15.594350 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-service-ca-bundle\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.594756 master-0 kubenswrapper[7508]: I0313 10:41:15.594731 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.595052 master-0 kubenswrapper[7508]: I0313 10:41:15.595025 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-metrics-certs\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.595189 master-0 kubenswrapper[7508]: I0313 10:41:15.595165 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-stats-auth\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.607853 master-0 kubenswrapper[7508]: I0313 10:41:15.607795 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qr9x5\" (UniqueName: \"kubernetes.io/projected/b460735c-56aa-4dd3-a756-759859083e12-kube-api-access-qr9x5\") pod \"network-check-source-7c67b67d47-zxjfv\" (UID: \"b460735c-56aa-4dd3-a756-759859083e12\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" Mar 13 10:41:15.609577 master-0 kubenswrapper[7508]: I0313 10:41:15.609547 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.609799 master-0 kubenswrapper[7508]: I0313 10:41:15.609763 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z5fj\" (UniqueName: \"kubernetes.io/projected/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-kube-api-access-8z5fj\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.769316 master-0 kubenswrapper[7508]: I0313 10:41:15.769189 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:15.782119 master-0 kubenswrapper[7508]: I0313 10:41:15.782065 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:15.811131 master-0 kubenswrapper[7508]: I0313 10:41:15.810931 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" Mar 13 10:41:15.813071 master-0 kubenswrapper[7508]: I0313 10:41:15.812810 7508 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:41:15.828489 master-0 kubenswrapper[7508]: I0313 10:41:15.828440 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:15.858741 master-0 kubenswrapper[7508]: I0313 10:41:15.858686 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" event={"ID":"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e","Type":"ContainerStarted","Data":"d8db377380cc25a98f74177a2d972c0aadff0f1684a6e93080f90cae3a912f32"} Mar 13 10:41:15.858741 master-0 kubenswrapper[7508]: I0313 10:41:15.858746 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" event={"ID":"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e","Type":"ContainerStarted","Data":"e669989fc04a3bbaaf8170906f7d49c5660764cd591eb569492010fe67858c9f"} Mar 13 10:41:15.858965 master-0 kubenswrapper[7508]: I0313 10:41:15.858763 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" event={"ID":"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e","Type":"ContainerStarted","Data":"d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d"} Mar 13 10:41:15.862401 master-0 kubenswrapper[7508]: I0313 10:41:15.861038 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" event={"ID":"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e","Type":"ContainerStarted","Data":"af7a768842b9cbb587f10537824efb3089e2d3b4f70fb674c1d644bca3af49d7"} Mar 13 10:41:15.915224 master-0 kubenswrapper[7508]: I0313 10:41:15.912861 7508 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" podStartSLOduration=1.912835182 podStartE2EDuration="1.912835182s" podCreationTimestamp="2026-03-13 10:41:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:15.892341551 +0000 UTC m=+314.635166668" watchObservedRunningTime="2026-03-13 10:41:15.912835182 +0000 UTC m=+314.655660299" Mar 13 10:41:16.003036 master-0 kubenswrapper[7508]: I0313 10:41:16.003004 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp"] Mar 13 10:41:16.014650 master-0 kubenswrapper[7508]: W0313 10:41:16.014613 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258f571e_5ec8_42df_b4ba_17457d87d10d.slice/crio-c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa WatchSource:0}: Error finding container c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa: Status 404 returned error can't find the container with id c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa Mar 13 10:41:16.288274 master-0 kubenswrapper[7508]: I0313 10:41:16.288213 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv"] Mar 13 10:41:16.291107 master-0 kubenswrapper[7508]: W0313 10:41:16.291034 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb460735c_56aa_4dd3_a756_759859083e12.slice/crio-0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513 WatchSource:0}: Error finding container 0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513: Status 404 returned error can't find the container with id 
0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513 Mar 13 10:41:16.434811 master-0 kubenswrapper[7508]: I0313 10:41:16.434761 7508 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 10:41:16.878128 master-0 kubenswrapper[7508]: I0313 10:41:16.875683 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" event={"ID":"277614e8-838f-4773-bcfc-89f19c620dee","Type":"ContainerStarted","Data":"e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb"} Mar 13 10:41:16.878128 master-0 kubenswrapper[7508]: I0313 10:41:16.875757 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" event={"ID":"277614e8-838f-4773-bcfc-89f19c620dee","Type":"ContainerStarted","Data":"2a3ae0ef1861ea401e0b8a9b1d8fd796b2315f2b16e1b237d258aa72508e4e53"} Mar 13 10:41:16.878128 master-0 kubenswrapper[7508]: I0313 10:41:16.876893 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:16.885111 master-0 kubenswrapper[7508]: I0313 10:41:16.880945 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" event={"ID":"b460735c-56aa-4dd3-a756-759859083e12","Type":"ContainerStarted","Data":"4605ba5397add60b5787249493f28afddf74b60e5e4cff2c37fbf2c850052e1f"} Mar 13 10:41:16.885111 master-0 kubenswrapper[7508]: I0313 10:41:16.880981 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" event={"ID":"b460735c-56aa-4dd3-a756-759859083e12","Type":"ContainerStarted","Data":"0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513"} Mar 13 10:41:16.900607 master-0 kubenswrapper[7508]: I0313 10:41:16.896042 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" event={"ID":"258f571e-5ec8-42df-b4ba-17457d87d10d","Type":"ContainerStarted","Data":"c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa"} Mar 13 10:41:16.911789 master-0 kubenswrapper[7508]: I0313 10:41:16.909027 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" podStartSLOduration=1.909003824 podStartE2EDuration="1.909003824s" podCreationTimestamp="2026-03-13 10:41:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:16.9030755 +0000 UTC m=+315.645900617" watchObservedRunningTime="2026-03-13 10:41:16.909003824 +0000 UTC m=+315.651828951" Mar 13 10:41:16.939746 master-0 kubenswrapper[7508]: I0313 10:41:16.939600 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" podStartSLOduration=372.939577941 podStartE2EDuration="6m12.939577941s" podCreationTimestamp="2026-03-13 10:35:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:16.922619623 +0000 UTC m=+315.665444760" watchObservedRunningTime="2026-03-13 10:41:16.939577941 +0000 UTC m=+315.682403058" Mar 13 10:41:17.954196 master-0 kubenswrapper[7508]: I0313 10:41:17.952797 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:18.575641 master-0 kubenswrapper[7508]: I0313 10:41:18.575581 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7z94w"] Mar 13 10:41:18.866132 master-0 kubenswrapper[7508]: I0313 10:41:18.865774 7508 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-server-zkmjs"] Mar 13 10:41:18.866601 master-0 kubenswrapper[7508]: I0313 10:41:18.866522 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:18.868887 master-0 kubenswrapper[7508]: I0313 10:41:18.868759 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 10:41:18.869166 master-0 kubenswrapper[7508]: I0313 10:41:18.869142 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-clxlg" Mar 13 10:41:18.869497 master-0 kubenswrapper[7508]: I0313 10:41:18.869462 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 10:41:19.043794 master-0 kubenswrapper[7508]: I0313 10:41:19.043751 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.044233 master-0 kubenswrapper[7508]: I0313 10:41:19.043885 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6smf\" (UniqueName: \"kubernetes.io/projected/161beda5-f575-4e60-8baa-5262a4fe86c7-kube-api-access-q6smf\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.044233 master-0 kubenswrapper[7508]: I0313 10:41:19.044044 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.146203 master-0 kubenswrapper[7508]: I0313 10:41:19.146153 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.146425 master-0 kubenswrapper[7508]: I0313 10:41:19.146255 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.146425 master-0 kubenswrapper[7508]: I0313 10:41:19.146279 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6smf\" (UniqueName: \"kubernetes.io/projected/161beda5-f575-4e60-8baa-5262a4fe86c7-kube-api-access-q6smf\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.150580 master-0 kubenswrapper[7508]: I0313 10:41:19.150526 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.153612 master-0 kubenswrapper[7508]: 
I0313 10:41:19.153575 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.168011 master-0 kubenswrapper[7508]: I0313 10:41:19.166511 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6smf\" (UniqueName: \"kubernetes.io/projected/161beda5-f575-4e60-8baa-5262a4fe86c7-kube-api-access-q6smf\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.195795 master-0 kubenswrapper[7508]: I0313 10:41:19.195732 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:19.941815 master-0 kubenswrapper[7508]: I0313 10:41:19.941740 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zkmjs" event={"ID":"161beda5-f575-4e60-8baa-5262a4fe86c7","Type":"ContainerStarted","Data":"5f85cde1e59c38c70f96b8d80a5986cf96d25b188ad7c135912463c3cc69c6c8"} Mar 13 10:41:19.941815 master-0 kubenswrapper[7508]: I0313 10:41:19.941802 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zkmjs" event={"ID":"161beda5-f575-4e60-8baa-5262a4fe86c7","Type":"ContainerStarted","Data":"1d85f90b35c0a6fe94e4911c5e6e2a9798938c9acd1504a9008825c00646ea44"} Mar 13 10:41:19.943992 master-0 kubenswrapper[7508]: I0313 10:41:19.943930 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" 
event={"ID":"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e","Type":"ContainerStarted","Data":"cf504ad2f3ecd51940abd8bf5bd673489b537e2883f503eb785901acbd1d1d46"} Mar 13 10:41:19.945634 master-0 kubenswrapper[7508]: I0313 10:41:19.945587 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" event={"ID":"258f571e-5ec8-42df-b4ba-17457d87d10d","Type":"ContainerStarted","Data":"ad7266ae43d7f039a194144e05ebf1043632f55b79736d2da49f78da98fb730b"} Mar 13 10:41:19.945779 master-0 kubenswrapper[7508]: I0313 10:41:19.945631 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" podUID="277614e8-838f-4773-bcfc-89f19c620dee" containerName="kube-multus-additional-cni-plugins" containerID="cri-o://e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb" gracePeriod=30 Mar 13 10:41:19.946146 master-0 kubenswrapper[7508]: I0313 10:41:19.945973 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:19.955126 master-0 kubenswrapper[7508]: I0313 10:41:19.955045 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:20.002741 master-0 kubenswrapper[7508]: I0313 10:41:20.002164 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-zkmjs" podStartSLOduration=2.002136587 podStartE2EDuration="2.002136587s" podCreationTimestamp="2026-03-13 10:41:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:19.965746539 +0000 UTC m=+318.708571676" watchObservedRunningTime="2026-03-13 10:41:20.002136587 +0000 UTC m=+318.744961734" Mar 13 
10:41:20.027623 master-0 kubenswrapper[7508]: I0313 10:41:20.027500 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podStartSLOduration=278.810732421 podStartE2EDuration="4m42.027461929s" podCreationTimestamp="2026-03-13 10:36:38 +0000 UTC" firstStartedPulling="2026-03-13 10:41:15.812328935 +0000 UTC m=+314.555154062" lastFinishedPulling="2026-03-13 10:41:19.029058453 +0000 UTC m=+317.771883570" observedRunningTime="2026-03-13 10:41:20.001728725 +0000 UTC m=+318.744553842" watchObservedRunningTime="2026-03-13 10:41:20.027461929 +0000 UTC m=+318.770287056" Mar 13 10:41:20.030348 master-0 kubenswrapper[7508]: I0313 10:41:20.028802 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" podStartSLOduration=264.054693745 podStartE2EDuration="4m27.028790688s" podCreationTimestamp="2026-03-13 10:36:53 +0000 UTC" firstStartedPulling="2026-03-13 10:41:16.016916544 +0000 UTC m=+314.759741671" lastFinishedPulling="2026-03-13 10:41:18.991013497 +0000 UTC m=+317.733838614" observedRunningTime="2026-03-13 10:41:20.023673518 +0000 UTC m=+318.766498645" watchObservedRunningTime="2026-03-13 10:41:20.028790688 +0000 UTC m=+318.771615805" Mar 13 10:41:20.538868 master-0 kubenswrapper[7508]: I0313 10:41:20.538787 7508 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 10:41:20.539641 master-0 kubenswrapper[7508]: I0313 10:41:20.539146 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://8f137541b8024be9dec3a0e2a3bb479dfd8210f470244154f734979cdb98e7ff" gracePeriod=30 Mar 13 10:41:20.540491 master-0 kubenswrapper[7508]: I0313 10:41:20.540197 7508 kubelet.go:2421] "SyncLoop ADD" 
source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:41:20.540491 master-0 kubenswrapper[7508]: E0313 10:41:20.540488 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:20.540786 master-0 kubenswrapper[7508]: I0313 10:41:20.540508 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:20.557829 master-0 kubenswrapper[7508]: I0313 10:41:20.547573 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:20.557829 master-0 kubenswrapper[7508]: I0313 10:41:20.547666 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:20.557829 master-0 kubenswrapper[7508]: E0313 10:41:20.548049 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:20.557829 master-0 kubenswrapper[7508]: I0313 10:41:20.548069 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:20.557829 master-0 kubenswrapper[7508]: I0313 10:41:20.553084 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.673533 master-0 kubenswrapper[7508]: I0313 10:41:20.673321 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.673533 master-0 kubenswrapper[7508]: I0313 10:41:20.673395 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.697699 master-0 kubenswrapper[7508]: I0313 10:41:20.697656 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:41:20.705664 master-0 kubenswrapper[7508]: I0313 10:41:20.705602 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:41:20.720691 master-0 kubenswrapper[7508]: I0313 10:41:20.720639 7508 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="f1b7a7db-b894-4f33-8bdb-3c0681908b8a" Mar 13 10:41:20.774329 master-0 kubenswrapper[7508]: I0313 10:41:20.774283 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.774442 master-0 kubenswrapper[7508]: I0313 10:41:20.774340 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.774442 master-0 kubenswrapper[7508]: I0313 10:41:20.774423 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.774564 master-0 kubenswrapper[7508]: I0313 10:41:20.774383 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod 
\"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:20.783064 master-0 kubenswrapper[7508]: I0313 10:41:20.783009 7508 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:20.785724 master-0 kubenswrapper[7508]: I0313 10:41:20.785683 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:20.785724 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:20.785724 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:20.785724 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:20.785900 master-0 kubenswrapper[7508]: I0313 10:41:20.785740 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:20.877526 master-0 kubenswrapper[7508]: I0313 10:41:20.877473 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 13 10:41:20.877743 master-0 kubenswrapper[7508]: I0313 10:41:20.877624 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 13 10:41:20.878090 master-0 kubenswrapper[7508]: I0313 10:41:20.877837 7508 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:20.878090 master-0 kubenswrapper[7508]: I0313 10:41:20.877891 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:20.954922 master-0 kubenswrapper[7508]: I0313 10:41:20.954869 7508 generic.go:334] "Generic (PLEG): container finished" podID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerID="00a4f5e044b3bb37309a0058cc340985271f0a9be303d372e70635d4947090aa" exitCode=0 Mar 13 10:41:20.955206 master-0 kubenswrapper[7508]: I0313 10:41:20.954938 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b","Type":"ContainerDied","Data":"00a4f5e044b3bb37309a0058cc340985271f0a9be303d372e70635d4947090aa"} Mar 13 10:41:20.960663 master-0 kubenswrapper[7508]: I0313 10:41:20.960593 7508 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="8f137541b8024be9dec3a0e2a3bb479dfd8210f470244154f734979cdb98e7ff" exitCode=0 Mar 13 10:41:20.961445 master-0 kubenswrapper[7508]: I0313 10:41:20.961409 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 13 10:41:20.964010 master-0 kubenswrapper[7508]: I0313 10:41:20.963194 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f" Mar 13 10:41:20.964010 master-0 kubenswrapper[7508]: I0313 10:41:20.963263 7508 scope.go:117] "RemoveContainer" containerID="c31330ee13d04180dffe9b5d1e1dc3fa90364bd389b7bdc31c0456dc4709e569" Mar 13 10:41:20.978864 master-0 kubenswrapper[7508]: I0313 10:41:20.978811 7508 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:20.978864 master-0 kubenswrapper[7508]: I0313 10:41:20.978854 7508 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:21.004262 master-0 kubenswrapper[7508]: I0313 10:41:21.003699 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:21.029894 master-0 kubenswrapper[7508]: W0313 10:41:21.029692 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d3d45b6ce1b3764f9927e623a71adf8.slice/crio-f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83 WatchSource:0}: Error finding container f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83: Status 404 returned error can't find the container with id f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83 Mar 13 10:41:21.513563 master-0 kubenswrapper[7508]: I0313 10:41:21.513390 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes" Mar 13 10:41:21.513809 master-0 kubenswrapper[7508]: I0313 10:41:21.513771 7508 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 13 10:41:21.530504 master-0 kubenswrapper[7508]: I0313 10:41:21.530419 7508 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 10:41:21.530504 master-0 kubenswrapper[7508]: I0313 10:41:21.530490 7508 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="f1b7a7db-b894-4f33-8bdb-3c0681908b8a" Mar 13 10:41:21.534128 master-0 kubenswrapper[7508]: I0313 10:41:21.534037 7508 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 13 10:41:21.534254 master-0 kubenswrapper[7508]: I0313 10:41:21.534134 7508 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="f1b7a7db-b894-4f33-8bdb-3c0681908b8a" Mar 13 10:41:21.785906 master-0 
kubenswrapper[7508]: I0313 10:41:21.785765 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:21.785906 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:21.785906 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:21.785906 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:21.785906 master-0 kubenswrapper[7508]: I0313 10:41:21.785857 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:21.968571 master-0 kubenswrapper[7508]: I0313 10:41:21.968526 7508 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" exitCode=0 Mar 13 10:41:21.968824 master-0 kubenswrapper[7508]: I0313 10:41:21.968646 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerDied","Data":"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709"} Mar 13 10:41:21.968824 master-0 kubenswrapper[7508]: I0313 10:41:21.968693 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83"} Mar 13 10:41:22.313048 master-0 kubenswrapper[7508]: I0313 10:41:22.312986 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 13 10:41:22.498481 master-0 kubenswrapper[7508]: I0313 10:41:22.498399 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kubelet-dir\") pod \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " Mar 13 10:41:22.498750 master-0 kubenswrapper[7508]: I0313 10:41:22.498547 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kube-api-access\") pod \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " Mar 13 10:41:22.498750 master-0 kubenswrapper[7508]: I0313 10:41:22.498602 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-var-lock\") pod \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\" (UID: \"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b\") " Mar 13 10:41:22.498834 master-0 kubenswrapper[7508]: I0313 10:41:22.498741 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-var-lock" (OuterVolumeSpecName: "var-lock") pod "a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" (UID: "a7c07c6e-447f-4111-9d5a-b848fc3e1b2b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:22.498834 master-0 kubenswrapper[7508]: I0313 10:41:22.498742 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" (UID: "a7c07c6e-447f-4111-9d5a-b848fc3e1b2b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:22.499378 master-0 kubenswrapper[7508]: I0313 10:41:22.499327 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:22.499378 master-0 kubenswrapper[7508]: I0313 10:41:22.499370 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:22.501422 master-0 kubenswrapper[7508]: I0313 10:41:22.501360 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" (UID: "a7c07c6e-447f-4111-9d5a-b848fc3e1b2b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:41:22.600437 master-0 kubenswrapper[7508]: I0313 10:41:22.600360 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a7c07c6e-447f-4111-9d5a-b848fc3e1b2b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:22.785854 master-0 kubenswrapper[7508]: I0313 10:41:22.785667 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:22.785854 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:22.785854 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:22.785854 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:22.785854 master-0 kubenswrapper[7508]: I0313 10:41:22.785755 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:22.981688 master-0 kubenswrapper[7508]: I0313 10:41:22.981612 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b","Type":"ContainerDied","Data":"1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406"} Mar 13 10:41:22.981688 master-0 kubenswrapper[7508]: I0313 10:41:22.981672 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406" Mar 13 10:41:22.981688 master-0 kubenswrapper[7508]: I0313 10:41:22.981690 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 13 10:41:22.984503 master-0 kubenswrapper[7508]: I0313 10:41:22.984455 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512"} Mar 13 10:41:22.984503 master-0 kubenswrapper[7508]: I0313 10:41:22.984499 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d"} Mar 13 10:41:23.018649 master-0 kubenswrapper[7508]: I0313 10:41:23.018545 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc"] Mar 13 10:41:23.019019 master-0 kubenswrapper[7508]: E0313 10:41:23.018927 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerName="installer" Mar 13 10:41:23.019019 master-0 kubenswrapper[7508]: I0313 10:41:23.018961 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerName="installer" Mar 13 10:41:23.019265 master-0 kubenswrapper[7508]: I0313 10:41:23.019198 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerName="installer" Mar 13 10:41:23.029143 master-0 kubenswrapper[7508]: I0313 10:41:23.021044 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.039166 master-0 kubenswrapper[7508]: I0313 10:41:23.038989 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 10:41:23.039409 master-0 kubenswrapper[7508]: I0313 10:41:23.038989 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2p4lb" Mar 13 10:41:23.041452 master-0 kubenswrapper[7508]: I0313 10:41:23.041391 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 10:41:23.049638 master-0 kubenswrapper[7508]: I0313 10:41:23.049566 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 10:41:23.069131 master-0 kubenswrapper[7508]: I0313 10:41:23.065864 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc"] Mar 13 10:41:23.118793 master-0 kubenswrapper[7508]: I0313 10:41:23.118722 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.118793 master-0 kubenswrapper[7508]: I0313 10:41:23.118807 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 
10:41:23.119084 master-0 kubenswrapper[7508]: I0313 10:41:23.118866 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.119084 master-0 kubenswrapper[7508]: I0313 10:41:23.118897 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws7gk\" (UniqueName: \"kubernetes.io/projected/7748068f-7409-4972-81d2-84cfb52b7af0-kube-api-access-ws7gk\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.221483 master-0 kubenswrapper[7508]: I0313 10:41:23.221411 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.221614 master-0 kubenswrapper[7508]: I0313 10:41:23.221554 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7gk\" (UniqueName: \"kubernetes.io/projected/7748068f-7409-4972-81d2-84cfb52b7af0-kube-api-access-ws7gk\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.221614 master-0 kubenswrapper[7508]: I0313 10:41:23.221599 7508 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.221692 master-0 kubenswrapper[7508]: I0313 10:41:23.221652 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.222932 master-0 kubenswrapper[7508]: I0313 10:41:23.222909 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.225871 master-0 kubenswrapper[7508]: I0313 10:41:23.225596 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.227857 master-0 kubenswrapper[7508]: I0313 10:41:23.227744 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: 
\"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.240384 master-0 kubenswrapper[7508]: I0313 10:41:23.240351 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7gk\" (UniqueName: \"kubernetes.io/projected/7748068f-7409-4972-81d2-84cfb52b7af0-kube-api-access-ws7gk\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.385226 master-0 kubenswrapper[7508]: I0313 10:41:23.385178 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:23.786524 master-0 kubenswrapper[7508]: I0313 10:41:23.785916 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:23.786524 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:23.786524 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:23.786524 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:23.786524 master-0 kubenswrapper[7508]: I0313 10:41:23.786184 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:23.872070 master-0 kubenswrapper[7508]: I0313 10:41:23.872016 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc"] Mar 13 10:41:23.874823 master-0 kubenswrapper[7508]: W0313 10:41:23.874757 7508 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7748068f_7409_4972_81d2_84cfb52b7af0.slice/crio-907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a WatchSource:0}: Error finding container 907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a: Status 404 returned error can't find the container with id 907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a Mar 13 10:41:23.993032 master-0 kubenswrapper[7508]: I0313 10:41:23.992970 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" event={"ID":"7748068f-7409-4972-81d2-84cfb52b7af0","Type":"ContainerStarted","Data":"907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a"} Mar 13 10:41:23.995949 master-0 kubenswrapper[7508]: I0313 10:41:23.995901 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7"} Mar 13 10:41:23.996194 master-0 kubenswrapper[7508]: I0313 10:41:23.996170 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:24.027013 master-0 kubenswrapper[7508]: I0313 10:41:24.026908 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=4.026881808 podStartE2EDuration="4.026881808s" podCreationTimestamp="2026-03-13 10:41:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:24.02182721 +0000 UTC m=+322.764652347" watchObservedRunningTime="2026-03-13 10:41:24.026881808 +0000 UTC m=+322.769706935" Mar 13 10:41:24.785844 master-0 kubenswrapper[7508]: I0313 10:41:24.785751 7508 
patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:24.785844 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:24.785844 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:24.785844 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:24.786294 master-0 kubenswrapper[7508]: I0313 10:41:24.785853 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:25.783131 master-0 kubenswrapper[7508]: I0313 10:41:25.783058 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:25.784926 master-0 kubenswrapper[7508]: I0313 10:41:25.784888 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:25.784926 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:25.784926 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:25.784926 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:25.785184 master-0 kubenswrapper[7508]: I0313 10:41:25.784941 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:25.831508 master-0 kubenswrapper[7508]: E0313 10:41:25.831430 7508 log.go:32] 
"ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 13 10:41:25.832961 master-0 kubenswrapper[7508]: E0313 10:41:25.832896 7508 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 13 10:41:25.835004 master-0 kubenswrapper[7508]: E0313 10:41:25.834952 7508 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 13 10:41:25.835119 master-0 kubenswrapper[7508]: E0313 10:41:25.835009 7508 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" podUID="277614e8-838f-4773-bcfc-89f19c620dee" containerName="kube-multus-additional-cni-plugins" Mar 13 10:41:26.010768 master-0 kubenswrapper[7508]: I0313 10:41:26.010703 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" event={"ID":"7748068f-7409-4972-81d2-84cfb52b7af0","Type":"ContainerStarted","Data":"70ef4c5f1d692f58502f8e513680c34b7093d5497ebb044ab29ea9dcc18a1719"} Mar 13 10:41:26.010768 master-0 kubenswrapper[7508]: I0313 10:41:26.010750 7508 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" event={"ID":"7748068f-7409-4972-81d2-84cfb52b7af0","Type":"ContainerStarted","Data":"c1df94cdd30ef8f9eee6f877acb1f8a1552be4430e802f50d22a9879330a2fc9"} Mar 13 10:41:26.027959 master-0 kubenswrapper[7508]: I0313 10:41:26.027865 7508 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" podStartSLOduration=4.356866254 podStartE2EDuration="6.027841964s" podCreationTimestamp="2026-03-13 10:41:20 +0000 UTC" firstStartedPulling="2026-03-13 10:41:23.877143827 +0000 UTC m=+322.619968944" lastFinishedPulling="2026-03-13 10:41:25.548119537 +0000 UTC m=+324.290944654" observedRunningTime="2026-03-13 10:41:26.025370742 +0000 UTC m=+324.768195859" watchObservedRunningTime="2026-03-13 10:41:26.027841964 +0000 UTC m=+324.770667081" Mar 13 10:41:26.785955 master-0 kubenswrapper[7508]: I0313 10:41:26.785868 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:26.785955 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:26.785955 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:26.785955 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:26.786807 master-0 kubenswrapper[7508]: I0313 10:41:26.785964 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:27.784530 master-0 kubenswrapper[7508]: I0313 10:41:27.784458 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP 
probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:27.784530 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:27.784530 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:27.784530 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:27.784966 master-0 kubenswrapper[7508]: I0313 10:41:27.784535 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:28.213538 master-0 kubenswrapper[7508]: I0313 10:41:28.213458 7508 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:41:28.214436 master-0 kubenswrapper[7508]: I0313 10:41:28.214406 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.217865 master-0 kubenswrapper[7508]: I0313 10:41:28.217817 7508 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 13 10:41:28.217865 master-0 kubenswrapper[7508]: I0313 10:41:28.217875 7508 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 10:41:28.218126 master-0 kubenswrapper[7508]: E0313 10:41:28.218083 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 10:41:28.218126 master-0 kubenswrapper[7508]: I0313 10:41:28.218122 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 10:41:28.218260 master-0 kubenswrapper[7508]: E0313 10:41:28.218137 7508 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 10:41:28.218260 master-0 kubenswrapper[7508]: I0313 10:41:28.218145 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 10:41:28.218260 master-0 kubenswrapper[7508]: E0313 10:41:28.218175 7508 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 10:41:28.218260 master-0 kubenswrapper[7508]: I0313 10:41:28.218182 7508 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 10:41:28.218454 master-0 kubenswrapper[7508]: I0313 10:41:28.218319 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 10:41:28.218454 master-0 kubenswrapper[7508]: I0313 10:41:28.218345 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 10:41:28.218454 master-0 kubenswrapper[7508]: I0313 10:41:28.218375 7508 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 10:41:28.220112 master-0 kubenswrapper[7508]: I0313 10:41:28.220061 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.220590 master-0 kubenswrapper[7508]: I0313 10:41:28.220532 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://038536df2c456779ce7e0291a2536f4028dbe7eacec6c366598f83e56cd809ba" gracePeriod=15 Mar 13 10:41:28.220696 master-0 kubenswrapper[7508]: I0313 10:41:28.220606 7508 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://9e202824b084c4177db3bd9002d881090f9c8da16dc67819aecdad944afe647d" gracePeriod=15 Mar 13 10:41:28.254230 master-0 kubenswrapper[7508]: I0313 10:41:28.254172 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"] Mar 13 10:41:28.255390 master-0 kubenswrapper[7508]: I0313 10:41:28.255371 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.257217 master-0 kubenswrapper[7508]: I0313 10:41:28.257172 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 10:41:28.257427 master-0 kubenswrapper[7508]: I0313 10:41:28.257405 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-c2nqj" Mar 13 10:41:28.257728 master-0 kubenswrapper[7508]: I0313 10:41:28.257703 7508 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 10:41:28.258102 master-0 kubenswrapper[7508]: I0313 10:41:28.258069 7508 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 10:41:28.267550 master-0 kubenswrapper[7508]: I0313 10:41:28.267502 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:41:28.274122 master-0 kubenswrapper[7508]: I0313 10:41:28.274051 7508 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"] Mar 13 10:41:28.294966 master-0 kubenswrapper[7508]: I0313 10:41:28.294905 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.294966 master-0 kubenswrapper[7508]: I0313 10:41:28.294956 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.294987 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txw7p\" (UniqueName: \"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295006 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295030 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7c5279e3-0165-4347-bfc7-87b80accaab3-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295049 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 
10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295072 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295116 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295132 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295153 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295183 7508 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295203 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295227 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.295240 master-0 kubenswrapper[7508]: I0313 10:41:28.295245 7508 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.301657 master-0 kubenswrapper[7508]: I0313 10:41:28.297332 7508 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 10:41:28.395674 master-0 kubenswrapper[7508]: I0313 10:41:28.395601 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.395905 master-0 kubenswrapper[7508]: I0313 10:41:28.395859 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.395945 master-0 kubenswrapper[7508]: I0313 10:41:28.395861 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.395945 master-0 kubenswrapper[7508]: I0313 10:41:28.395894 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.395945 master-0 kubenswrapper[7508]: I0313 10:41:28.395936 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.396042 master-0 kubenswrapper[7508]: I0313 10:41:28.396007 7508 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-txw7p\" (UniqueName: \"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.396113 master-0 kubenswrapper[7508]: I0313 10:41:28.396062 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.396152 master-0 kubenswrapper[7508]: I0313 10:41:28.396017 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.396152 master-0 kubenswrapper[7508]: I0313 10:41:28.396145 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7c5279e3-0165-4347-bfc7-87b80accaab3-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.396237 master-0 kubenswrapper[7508]: I0313 10:41:28.396218 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.396289 master-0 kubenswrapper[7508]: I0313 10:41:28.396250 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.396289 master-0 kubenswrapper[7508]: I0313 10:41:28.396275 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.396396 master-0 kubenswrapper[7508]: I0313 10:41:28.396374 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.396442 master-0 kubenswrapper[7508]: I0313 10:41:28.396403 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.396442 master-0 kubenswrapper[7508]: I0313 10:41:28.396436 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.396560 master-0 kubenswrapper[7508]: I0313 10:41:28.396529 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.396943 master-0 kubenswrapper[7508]: I0313 10:41:28.396908 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7c5279e3-0165-4347-bfc7-87b80accaab3-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.397003 master-0 kubenswrapper[7508]: E0313 10:41:28.396954 7508 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 13 10:41:28.397331 master-0 kubenswrapper[7508]: I0313 10:41:28.397309 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.397387 master-0 kubenswrapper[7508]: I0313 10:41:28.397336 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca\") pod 
\"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.397387 master-0 kubenswrapper[7508]: I0313 10:41:28.397364 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.397387 master-0 kubenswrapper[7508]: I0313 10:41:28.397367 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.397508 master-0 kubenswrapper[7508]: E0313 10:41:28.397394 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:28.897317926 +0000 UTC m=+327.640143043 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : secret "kube-state-metrics-tls" not found Mar 13 10:41:28.397508 master-0 kubenswrapper[7508]: I0313 10:41:28.397433 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.397508 master-0 kubenswrapper[7508]: I0313 10:41:28.397455 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.397508 master-0 kubenswrapper[7508]: I0313 10:41:28.397498 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.397665 master-0 kubenswrapper[7508]: I0313 10:41:28.397568 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.398008 master-0 
kubenswrapper[7508]: E0313 10:41:28.397977 7508 projected.go:194] Error preparing data for projected volume kube-api-access-txw7p for pod openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/kube-state-metrics/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 10:41:28.398085 master-0 kubenswrapper[7508]: E0313 10:41:28.398049 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:28.898034747 +0000 UTC m=+327.640859944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-txw7p" (UniqueName: "kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/kube-state-metrics/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 10:41:28.398252 master-0 kubenswrapper[7508]: E0313 10:41:28.398076 7508 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-state-metrics-68b88f8cb5-dw9w6.189c60878222a11d openshift-monitoring 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-monitoring,Name:kube-state-metrics-68b88f8cb5-dw9w6,UID:7c5279e3-0165-4347-bfc7-87b80accaab3,APIVersion:v1,ResourceVersion:11218,FieldPath:,},Reason:FailedMount,Message:MountVolume.SetUp failed for volume 
\"kube-state-metrics-tls\" : secret \"kube-state-metrics-tls\" not found,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:41:28.397291805 +0000 UTC m=+327.140116942,LastTimestamp:2026-03-13 10:41:28.397291805 +0000 UTC m=+327.140116942,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:41:28.409423 master-0 kubenswrapper[7508]: I0313 10:41:28.409385 7508 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.565115 master-0 kubenswrapper[7508]: I0313 10:41:28.564935 7508 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:28.592124 master-0 kubenswrapper[7508]: W0313 10:41:28.592047 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacbb43bf2cf27ed60d1f635fd6638ac7.slice/crio-32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65 WatchSource:0}: Error finding container 32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65: Status 404 returned error can't find the container with id 32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65 Mar 13 10:41:28.594298 master-0 kubenswrapper[7508]: I0313 10:41:28.594248 7508 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:28.624438 master-0 kubenswrapper[7508]: W0313 10:41:28.624293 7508 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c3280e9367536f782caf8bdc07edb85.slice/crio-a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff WatchSource:0}: Error finding container a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff: Status 404 returned error can't find the container with id a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff Mar 13 10:41:28.791782 master-0 kubenswrapper[7508]: I0313 10:41:28.791717 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:28.791782 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:28.791782 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:28.791782 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:28.792143 master-0 kubenswrapper[7508]: I0313 10:41:28.791809 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:28.903876 master-0 kubenswrapper[7508]: I0313 10:41:28.903818 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txw7p\" (UniqueName: \"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.904156 master-0 
kubenswrapper[7508]: I0313 10:41:28.903901 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:28.904156 master-0 kubenswrapper[7508]: E0313 10:41:28.904023 7508 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 13 10:41:28.904156 master-0 kubenswrapper[7508]: E0313 10:41:28.904078 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:29.904061296 +0000 UTC m=+328.646886413 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : secret "kube-state-metrics-tls" not found Mar 13 10:41:28.905038 master-0 kubenswrapper[7508]: E0313 10:41:28.905002 7508 projected.go:194] Error preparing data for projected volume kube-api-access-txw7p for pod openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6: failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/kube-state-metrics/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 10:41:28.905161 master-0 kubenswrapper[7508]: E0313 10:41:28.905141 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:29.905090866 +0000 UTC m=+328.647915993 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-txw7p" (UniqueName: "kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : failed to fetch token: Post "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/kube-state-metrics/token": dial tcp 192.168.32.10:6443: connect: connection refused Mar 13 10:41:29.040546 master-0 kubenswrapper[7508]: I0313 10:41:29.040475 7508 generic.go:334] "Generic (PLEG): container finished" podID="3b44838d-cfe0-42fe-9927-d0b5391eee81" containerID="4f57dbde7e6dd83a3f45d28b694622a3cd36e451a3d2e531b974cdf91eee3a45" exitCode=0 Mar 13 10:41:29.040780 master-0 kubenswrapper[7508]: I0313 10:41:29.040561 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"3b44838d-cfe0-42fe-9927-d0b5391eee81","Type":"ContainerDied","Data":"4f57dbde7e6dd83a3f45d28b694622a3cd36e451a3d2e531b974cdf91eee3a45"} Mar 13 10:41:29.042111 master-0 kubenswrapper[7508]: I0313 10:41:29.042040 7508 status_manager.go:851] "Failed to get status for pod" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.042352 master-0 kubenswrapper[7508]: I0313 10:41:29.042318 7508 generic.go:334] "Generic (PLEG): container finished" podID="4c3280e9367536f782caf8bdc07edb85" containerID="b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d" exitCode=0 Mar 13 10:41:29.042532 master-0 kubenswrapper[7508]: I0313 10:41:29.042444 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerDied","Data":"b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d"} Mar 13 10:41:29.042532 master-0 kubenswrapper[7508]: I0313 10:41:29.042504 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff"} Mar 13 10:41:29.042895 master-0 kubenswrapper[7508]: I0313 10:41:29.042816 7508 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.044162 master-0 kubenswrapper[7508]: I0313 10:41:29.044071 7508 status_manager.go:851] "Failed to get status for pod" podUID="3b44838d-cfe0-42fe-9927-d0b5391eee81" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.044876 master-0 kubenswrapper[7508]: I0313 10:41:29.044811 7508 status_manager.go:851] "Failed to get status for pod" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.045606 master-0 kubenswrapper[7508]: I0313 10:41:29.045562 7508 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.046210 master-0 kubenswrapper[7508]: I0313 10:41:29.046163 7508 status_manager.go:851] "Failed to get status for pod" podUID="3b44838d-cfe0-42fe-9927-d0b5391eee81" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.047398 master-0 kubenswrapper[7508]: I0313 10:41:29.047370 7508 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="9e202824b084c4177db3bd9002d881090f9c8da16dc67819aecdad944afe647d" exitCode=0 Mar 13 10:41:29.049408 master-0 kubenswrapper[7508]: I0313 10:41:29.049359 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"acbb43bf2cf27ed60d1f635fd6638ac7","Type":"ContainerStarted","Data":"d472edafe5d759160888ed04e3afd874976a6f531f1a77a132237f479f6f2ec3"} Mar 13 10:41:29.049509 master-0 kubenswrapper[7508]: I0313 10:41:29.049450 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"acbb43bf2cf27ed60d1f635fd6638ac7","Type":"ContainerStarted","Data":"32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65"} Mar 13 10:41:29.050385 master-0 kubenswrapper[7508]: I0313 10:41:29.050343 7508 status_manager.go:851] "Failed to get status for pod" podUID="3b44838d-cfe0-42fe-9927-d0b5391eee81" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection 
refused" Mar 13 10:41:29.050899 master-0 kubenswrapper[7508]: I0313 10:41:29.050857 7508 status_manager.go:851] "Failed to get status for pod" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.051388 master-0 kubenswrapper[7508]: I0313 10:41:29.051353 7508 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:41:29.785856 master-0 kubenswrapper[7508]: I0313 10:41:29.785734 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:29.785856 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:29.785856 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:29.785856 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:29.785856 master-0 kubenswrapper[7508]: I0313 10:41:29.785808 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:29.972934 master-0 kubenswrapper[7508]: I0313 10:41:29.972876 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txw7p\" (UniqueName: 
\"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:29.973647 master-0 kubenswrapper[7508]: I0313 10:41:29.973411 7508 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:29.973647 master-0 kubenswrapper[7508]: E0313 10:41:29.973575 7508 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Mar 13 10:41:29.973647 master-0 kubenswrapper[7508]: E0313 10:41:29.973628 7508 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:31.973612649 +0000 UTC m=+330.716437766 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : secret "kube-state-metrics-tls" not found Mar 13 10:41:30.058569 master-0 kubenswrapper[7508]: I0313 10:41:30.058522 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2"} Mar 13 10:41:30.058679 master-0 kubenswrapper[7508]: I0313 10:41:30.058597 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6"} Mar 13 10:41:30.058679 master-0 kubenswrapper[7508]: I0313 10:41:30.058612 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f"} Mar 13 10:41:30.471138 master-0 kubenswrapper[7508]: I0313 10:41:30.469469 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:30.479601 master-0 kubenswrapper[7508]: I0313 10:41:30.479545 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"3b44838d-cfe0-42fe-9927-d0b5391eee81\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " Mar 13 10:41:30.479876 master-0 kubenswrapper[7508]: I0313 10:41:30.479860 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"3b44838d-cfe0-42fe-9927-d0b5391eee81\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " Mar 13 10:41:30.480010 master-0 kubenswrapper[7508]: I0313 10:41:30.479998 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"3b44838d-cfe0-42fe-9927-d0b5391eee81\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " Mar 13 10:41:30.480424 master-0 kubenswrapper[7508]: I0313 10:41:30.480371 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b44838d-cfe0-42fe-9927-d0b5391eee81" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:30.480643 master-0 kubenswrapper[7508]: I0313 10:41:30.480620 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock" (OuterVolumeSpecName: "var-lock") pod "3b44838d-cfe0-42fe-9927-d0b5391eee81" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:30.495861 master-0 kubenswrapper[7508]: I0313 10:41:30.495803 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b44838d-cfe0-42fe-9927-d0b5391eee81" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:41:30.582056 master-0 kubenswrapper[7508]: I0313 10:41:30.581979 7508 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:30.582056 master-0 kubenswrapper[7508]: I0313 10:41:30.582033 7508 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:30.582056 master-0 kubenswrapper[7508]: I0313 10:41:30.582048 7508 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:30.880845 master-0 kubenswrapper[7508]: I0313 10:41:30.878952 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:30.880845 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:30.880845 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:30.880845 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:30.880845 master-0 kubenswrapper[7508]: I0313 10:41:30.879092 7508 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:31.068014 master-0 kubenswrapper[7508]: I0313 10:41:31.067943 7508 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="038536df2c456779ce7e0291a2536f4028dbe7eacec6c366598f83e56cd809ba" exitCode=0 Mar 13 10:41:31.068283 master-0 kubenswrapper[7508]: I0313 10:41:31.068075 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0" Mar 13 10:41:31.069704 master-0 kubenswrapper[7508]: I0313 10:41:31.069675 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"3b44838d-cfe0-42fe-9927-d0b5391eee81","Type":"ContainerDied","Data":"d4e74163544c10bf31d045c60068db268de2c869878f5f7b983afe24046cf63d"} Mar 13 10:41:31.069844 master-0 kubenswrapper[7508]: I0313 10:41:31.069824 7508 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e74163544c10bf31d045c60068db268de2c869878f5f7b983afe24046cf63d" Mar 13 10:41:31.069967 master-0 kubenswrapper[7508]: I0313 10:41:31.069748 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:31.072670 master-0 kubenswrapper[7508]: I0313 10:41:31.072629 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc"} Mar 13 10:41:31.072670 master-0 kubenswrapper[7508]: I0313 10:41:31.072671 7508 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12"} Mar 13 10:41:31.072912 master-0 kubenswrapper[7508]: I0313 10:41:31.072887 7508 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:31.108350 master-0 kubenswrapper[7508]: I0313 10:41:31.108322 7508 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272356 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272425 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272456 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272492 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272518 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272542 7508 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272777 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs" (OuterVolumeSpecName: "logs") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272857 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272873 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272886 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config" (OuterVolumeSpecName: "config") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272894 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.272926 7508 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets" (OuterVolumeSpecName: "secrets") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.273658 7508 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.273676 7508 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.273685 7508 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.273696 7508 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") on node \"master-0\" DevicePath 
\"\"" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.273705 7508 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:31.276203 master-0 kubenswrapper[7508]: I0313 10:41:31.273713 7508 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: I0313 10:41:31.787063 7508 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes" Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: I0313 10:41:31.788826 7508 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: I0313 10:41:31.795653 7508 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-mbkch container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: [-]has-synced failed: reason withheld Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: [+]process-running ok Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: healthz check failed Mar 13 10:41:31.798459 master-0 kubenswrapper[7508]: I0313 10:41:31.795758 7508 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" podUID="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 10:41:31.961876 master-0 systemd[1]: Stopping Kubernetes Kubelet... 
Mar 13 10:41:31.962562 master-0 kubenswrapper[7508]: I0313 10:41:31.962251 7508 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 13 10:41:32.005764 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 13 10:41:32.006143 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 13 10:41:32.010533 master-0 systemd[1]: kubelet.service: Consumed 54.169s CPU time. Mar 13 10:41:32.065077 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 13 10:41:32.216988 master-0 kubenswrapper[17876]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:41:32.216988 master-0 kubenswrapper[17876]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 13 10:41:32.216988 master-0 kubenswrapper[17876]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:41:32.216988 master-0 kubenswrapper[17876]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:41:32.216988 master-0 kubenswrapper[17876]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 13 10:41:32.216988 master-0 kubenswrapper[17876]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. 
See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 13 10:41:32.218110 master-0 kubenswrapper[17876]: I0313 10:41:32.217140 17876 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222669 17876 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222693 17876 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222698 17876 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222702 17876 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222706 17876 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222709 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222713 17876 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222717 17876 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222720 17876 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222724 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222727 17876 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 13 
10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222733 17876 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222738 17876 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222742 17876 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222745 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222749 17876 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222752 17876 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222756 17876 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222760 17876 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 13 10:41:32.223006 master-0 kubenswrapper[17876]: W0313 10:41:32.222764 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222767 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222771 17876 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222775 17876 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222778 17876 feature_gate.go:330] 
unrecognized feature gate: InsightsOnDemandDataGather Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222783 17876 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222787 17876 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222791 17876 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222795 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222799 17876 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222804 17876 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222809 17876 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222813 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222817 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222831 17876 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222839 17876 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222846 17876 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222855 17876 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222859 17876 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 13 10:41:32.223896 master-0 kubenswrapper[17876]: W0313 10:41:32.222864 17876 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222868 17876 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222872 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222876 17876 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222879 17876 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222883 17876 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222887 17876 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222890 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222894 17876 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222898 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222902 17876 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222906 17876 feature_gate.go:330] unrecognized feature gate: Example Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 
10:41:32.222910 17876 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222913 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222917 17876 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222921 17876 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222924 17876 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222928 17876 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222931 17876 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222935 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 13 10:41:32.224533 master-0 kubenswrapper[17876]: W0313 10:41:32.222939 17876 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222943 17876 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222947 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222951 17876 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222954 17876 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222958 17876 
feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222962 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222976 17876 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222984 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222988 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222992 17876 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222996 17876 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.222999 17876 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: W0313 10:41:32.223003 17876 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223140 17876 flags.go:64] FLAG: --address="0.0.0.0" Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223156 17876 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223165 17876 flags.go:64] FLAG: --anonymous-auth="true" Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223171 17876 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223176 17876 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 13 10:41:32.225165 master-0 
kubenswrapper[17876]: I0313 10:41:32.223181 17876 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223186 17876 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 13 10:41:32.225165 master-0 kubenswrapper[17876]: I0313 10:41:32.223192 17876 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223197 17876 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223203 17876 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223211 17876 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223216 17876 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223222 17876 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223228 17876 flags.go:64] FLAG: --cgroup-root="" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223233 17876 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223239 17876 flags.go:64] FLAG: --client-ca-file="" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223244 17876 flags.go:64] FLAG: --cloud-config="" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223249 17876 flags.go:64] FLAG: --cloud-provider="" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223254 17876 flags.go:64] FLAG: --cluster-dns="[]" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223262 17876 flags.go:64] FLAG: --cluster-domain="" Mar 13 10:41:32.225971 
master-0 kubenswrapper[17876]: I0313 10:41:32.223267 17876 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223272 17876 flags.go:64] FLAG: --config-dir="" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223277 17876 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223283 17876 flags.go:64] FLAG: --container-log-max-files="5" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223294 17876 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223328 17876 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223334 17876 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223339 17876 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223345 17876 flags.go:64] FLAG: --contention-profiling="false" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223350 17876 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223355 17876 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223361 17876 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 13 10:41:32.225971 master-0 kubenswrapper[17876]: I0313 10:41:32.223367 17876 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223374 17876 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223380 17876 flags.go:64] FLAG: 
--enable-controller-attach-detach="true" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223385 17876 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223390 17876 flags.go:64] FLAG: --enable-load-reader="false" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223396 17876 flags.go:64] FLAG: --enable-server="true" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223401 17876 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223407 17876 flags.go:64] FLAG: --event-burst="100" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223413 17876 flags.go:64] FLAG: --event-qps="50" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223418 17876 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223424 17876 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223429 17876 flags.go:64] FLAG: --eviction-hard="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223435 17876 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223441 17876 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223447 17876 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223453 17876 flags.go:64] FLAG: --eviction-soft="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223458 17876 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223463 17876 flags.go:64] FLAG: 
--exit-on-lock-contention="false" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223469 17876 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223474 17876 flags.go:64] FLAG: --experimental-mounter-path="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223478 17876 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223483 17876 flags.go:64] FLAG: --fail-swap-on="true" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223487 17876 flags.go:64] FLAG: --feature-gates="" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223492 17876 flags.go:64] FLAG: --file-check-frequency="20s" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223497 17876 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 13 10:41:32.227001 master-0 kubenswrapper[17876]: I0313 10:41:32.223504 17876 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223508 17876 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223513 17876 flags.go:64] FLAG: --healthz-port="10248" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223517 17876 flags.go:64] FLAG: --help="false" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223522 17876 flags.go:64] FLAG: --hostname-override="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223527 17876 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223532 17876 flags.go:64] FLAG: --http-check-frequency="20s" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223537 17876 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 13 
10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223542 17876 flags.go:64] FLAG: --image-credential-provider-config="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223547 17876 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223552 17876 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223558 17876 flags.go:64] FLAG: --image-service-endpoint="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223563 17876 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223568 17876 flags.go:64] FLAG: --kube-api-burst="100" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223574 17876 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223580 17876 flags.go:64] FLAG: --kube-api-qps="50" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223585 17876 flags.go:64] FLAG: --kube-reserved="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223590 17876 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223597 17876 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223603 17876 flags.go:64] FLAG: --kubelet-cgroups="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223608 17876 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223614 17876 flags.go:64] FLAG: --lock-file="" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223619 17876 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 13 10:41:32.227943 
master-0 kubenswrapper[17876]: I0313 10:41:32.223624 17876 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223629 17876 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223645 17876 flags.go:64] FLAG: --log-json-split-stream="false" Mar 13 10:41:32.227943 master-0 kubenswrapper[17876]: I0313 10:41:32.223651 17876 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223656 17876 flags.go:64] FLAG: --log-text-split-stream="false" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223661 17876 flags.go:64] FLAG: --logging-format="text" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223665 17876 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223670 17876 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223674 17876 flags.go:64] FLAG: --manifest-url="" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223681 17876 flags.go:64] FLAG: --manifest-url-header="" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223687 17876 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223691 17876 flags.go:64] FLAG: --max-open-files="1000000" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223697 17876 flags.go:64] FLAG: --max-pods="110" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223701 17876 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223706 17876 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 13 10:41:32.228714 master-0 
kubenswrapper[17876]: I0313 10:41:32.223710 17876 flags.go:64] FLAG: --memory-manager-policy="None" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223714 17876 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223718 17876 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223722 17876 flags.go:64] FLAG: --node-ip="192.168.32.10" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223726 17876 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223737 17876 flags.go:64] FLAG: --node-status-max-images="50" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223741 17876 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223746 17876 flags.go:64] FLAG: --oom-score-adj="-999" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223750 17876 flags.go:64] FLAG: --pod-cidr="" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223754 17876 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223762 17876 flags.go:64] FLAG: --pod-manifest-path="" Mar 13 10:41:32.228714 master-0 kubenswrapper[17876]: I0313 10:41:32.223766 17876 flags.go:64] FLAG: --pod-max-pids="-1" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223771 17876 flags.go:64] FLAG: --pods-per-core="0" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223775 17876 flags.go:64] FLAG: --port="10250" Mar 13 10:41:32.229528 master-0 
kubenswrapper[17876]: I0313 10:41:32.223780 17876 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223784 17876 flags.go:64] FLAG: --provider-id="" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223788 17876 flags.go:64] FLAG: --qos-reserved="" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223792 17876 flags.go:64] FLAG: --read-only-port="10255" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223796 17876 flags.go:64] FLAG: --register-node="true" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223801 17876 flags.go:64] FLAG: --register-schedulable="true" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223805 17876 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223819 17876 flags.go:64] FLAG: --registry-burst="10" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223823 17876 flags.go:64] FLAG: --registry-qps="5" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223829 17876 flags.go:64] FLAG: --reserved-cpus="" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223834 17876 flags.go:64] FLAG: --reserved-memory="" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223840 17876 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223845 17876 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223853 17876 flags.go:64] FLAG: --rotate-certificates="false" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223859 17876 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223863 17876 
flags.go:64] FLAG: --runonce="false" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223869 17876 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223874 17876 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223880 17876 flags.go:64] FLAG: --seccomp-default="false" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223885 17876 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223890 17876 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223896 17876 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223902 17876 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 13 10:41:32.229528 master-0 kubenswrapper[17876]: I0313 10:41:32.223908 17876 flags.go:64] FLAG: --storage-driver-password="root" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223915 17876 flags.go:64] FLAG: --storage-driver-secure="false" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223921 17876 flags.go:64] FLAG: --storage-driver-table="stats" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223927 17876 flags.go:64] FLAG: --storage-driver-user="root" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223932 17876 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223937 17876 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223943 17876 flags.go:64] FLAG: --system-cgroups="" Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223949 17876 
flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223958 17876 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223963 17876 flags.go:64] FLAG: --tls-cert-file=""
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223968 17876 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223976 17876 flags.go:64] FLAG: --tls-min-version=""
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223981 17876 flags.go:64] FLAG: --tls-private-key-file=""
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223986 17876 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223991 17876 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.223995 17876 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.224000 17876 flags.go:64] FLAG: --v="2"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.224007 17876 flags.go:64] FLAG: --version="false"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.224014 17876 flags.go:64] FLAG: --vmodule=""
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.224020 17876 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: I0313 10:41:32.224026 17876 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: W0313 10:41:32.224170 17876 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: W0313 10:41:32.224183 17876 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: W0313 10:41:32.224188 17876 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:41:32.230354 master-0 kubenswrapper[17876]: W0313 10:41:32.224193 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224197 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224202 17876 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224208 17876 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224213 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224218 17876 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224222 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224226 17876 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224231 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224236 17876 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224240 17876 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224245 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224250 17876 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224254 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224259 17876 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224264 17876 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224269 17876 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224273 17876 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224277 17876 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224282 17876 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:41:32.231044 master-0 kubenswrapper[17876]: W0313 10:41:32.224287 17876 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224291 17876 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224296 17876 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224302 17876 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224309 17876 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224314 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224320 17876 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224326 17876 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224331 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224337 17876 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224343 17876 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224346 17876 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224350 17876 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224354 17876 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224359 17876 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224362 17876 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224366 17876 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224370 17876 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224374 17876 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:41:32.231661 master-0 kubenswrapper[17876]: W0313 10:41:32.224378 17876 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224382 17876 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224387 17876 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224391 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224395 17876 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224400 17876 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224404 17876 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224409 17876 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224413 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224417 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224422 17876 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224427 17876 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224431 17876 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224435 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224440 17876 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224445 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224450 17876 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224454 17876 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224458 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224462 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:41:32.232296 master-0 kubenswrapper[17876]: W0313 10:41:32.224468 17876 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224474 17876 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224479 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224487 17876 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224491 17876 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224496 17876 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224505 17876 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224509 17876 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224515 17876 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.224521 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: I0313 10:41:32.224548 17876 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: I0313 10:41:32.229312 17876 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: I0313 10:41:32.229347 17876 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.229430 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.229440 17876 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:41:32.232963 master-0 kubenswrapper[17876]: W0313 10:41:32.229445 17876 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229476 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229483 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229490 17876 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229545 17876 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229555 17876 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229560 17876 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229565 17876 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229569 17876 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229574 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229580 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229584 17876 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229590 17876 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229596 17876 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229627 17876 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229634 17876 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229641 17876 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229647 17876 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229653 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:41:32.233443 master-0 kubenswrapper[17876]: W0313 10:41:32.229658 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229664 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229702 17876 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229708 17876 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229724 17876 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229729 17876 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229735 17876 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229747 17876 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229752 17876 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229758 17876 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229763 17876 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229769 17876 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229774 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229781 17876 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229787 17876 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229791 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229796 17876 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229801 17876 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229806 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229811 17876 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:41:32.234105 master-0 kubenswrapper[17876]: W0313 10:41:32.229816 17876 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229822 17876 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229828 17876 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229833 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229838 17876 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229843 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229847 17876 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229852 17876 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229856 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229862 17876 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229866 17876 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229871 17876 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229875 17876 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229889 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229893 17876 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229898 17876 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229902 17876 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229907 17876 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229911 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:41:32.234922 master-0 kubenswrapper[17876]: W0313 10:41:32.229916 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229921 17876 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229925 17876 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229932 17876 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229936 17876 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229941 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229946 17876 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229951 17876 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229955 17876 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229961 17876 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229968 17876 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.229974 17876 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: I0313 10:41:32.229982 17876 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.230157 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.230171 17876 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 13 10:41:32.235790 master-0 kubenswrapper[17876]: W0313 10:41:32.230176 17876 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230182 17876 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230188 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230194 17876 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230201 17876 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230206 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230211 17876 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230216 17876 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230221 17876 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230226 17876 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230230 17876 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230235 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230240 17876 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230244 17876 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230249 17876 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230254 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230259 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230264 17876 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230269 17876 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230274 17876 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 13 10:41:32.236286 master-0 kubenswrapper[17876]: W0313 10:41:32.230290 17876 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230295 17876 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230300 17876 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230311 17876 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230316 17876 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230321 17876 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230326 17876 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230333 17876 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230340 17876 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230345 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230350 17876 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230356 17876 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230364 17876 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230370 17876 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230375 17876 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230380 17876 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230386 17876 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230401 17876 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230411 17876 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230415 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 13 10:41:32.236923 master-0 kubenswrapper[17876]: W0313 10:41:32.230420 17876 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230424 17876 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230429 17876 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230439 17876 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230443 17876 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230448 17876 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230453 17876 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230458 17876 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230462 17876 feature_gate.go:330] unrecognized feature gate: Example
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230467 17876 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230479 17876 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230484 17876 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230494 17876 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230499 17876 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230504 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230508 17876 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230513 17876 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230518 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230523 17876 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230527 17876 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 13 10:41:32.237498 master-0 kubenswrapper[17876]: W0313 10:41:32.230532 17876 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230537 17876 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230542 17876 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230546 17876 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230553 17876 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230559 17876 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230565 17876 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230574 17876 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230579 17876 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: W0313 10:41:32.230585 17876 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: I0313 10:41:32.230592 17876 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: I0313 10:41:32.230796 17876 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: I0313 10:41:32.233389 17876 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: I0313 10:41:32.233503 17876 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 13 10:41:32.238052 master-0 kubenswrapper[17876]: I0313 10:41:32.234115 17876 server.go:997] "Starting client certificate rotation"
Mar 13 10:41:32.238637 master-0 kubenswrapper[17876]: I0313 10:41:32.234144 17876 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 13 10:41:32.238637 master-0 kubenswrapper[17876]: I0313 10:41:32.234489 17876 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 07:26:35.464206257 +0000 UTC
Mar 13 10:41:32.238637 master-0 kubenswrapper[17876]: I0313 10:41:32.234641 17876 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h45m3.229569396s for next certificate rotation
Mar 13 10:41:32.238637 master-0 kubenswrapper[17876]: I0313 10:41:32.235204 17876 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 10:41:32.238637 master-0 kubenswrapper[17876]: I0313 10:41:32.237302 17876 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 13 10:41:32.240377 master-0 kubenswrapper[17876]: I0313 10:41:32.240339 17876 log.go:25] "Validated CRI v1 runtime API"
Mar 13 10:41:32.247791 master-0 kubenswrapper[17876]: I0313 10:41:32.247744 17876 log.go:25] "Validated CRI v1 image API"
Mar 13 10:41:32.249830 master-0 kubenswrapper[17876]: I0313 10:41:32.249771 17876 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 13 10:41:32.260043 master-0 kubenswrapper[17876]: I0313 10:41:32.259971 17876 fs.go:135] Filesystem UUIDs: map[58e57e2d-ae5b-4324-bfe8-6d8d8bd04e58:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 13 10:41:32.261209 master-0 kubenswrapper[17876]: I0313 10:41:32.260024 17876 fs.go:136] Filesystem partitions:
map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5/userdata/shm major:0 minor:993 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc/userdata/shm major:0 minor:594 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2/userdata/shm major:0 minor:266 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513/userdata/shm major:0 minor:1045 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/136e725a814882d97a92b91f392b5a4bb1498352a85819c564006fc0555c46b2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/136e725a814882d97a92b91f392b5a4bb1498352a85819c564006fc0555c46b2/userdata/shm major:0 minor:391 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/19f35bad4079f0b545148fd4db4666ab80db062f38092a6802b80cab4ec7982a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/19f35bad4079f0b545148fd4db4666ab80db062f38092a6802b80cab4ec7982a/userdata/shm major:0 minor:461 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d/userdata/shm major:0 minor:835 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1d85f90b35c0a6fe94e4911c5e6e2a9798938c9acd1504a9008825c00646ea44/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1d85f90b35c0a6fe94e4911c5e6e2a9798938c9acd1504a9008825c00646ea44/userdata/shm major:0 minor:1070 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d/userdata/shm major:0 minor:447 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/26320b73ca3fce1850dde3e75da5ccc58878b72f0f352ff1a9c176723a2b7d3d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26320b73ca3fce1850dde3e75da5ccc58878b72f0f352ff1a9c176723a2b7d3d/userdata/shm major:0 minor:459 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2659c5a6a41b8bd57f0bf3c1da691ca647e461b974a89f7c9f8fe2c464e9654a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2659c5a6a41b8bd57f0bf3c1da691ca647e461b974a89f7c9f8fe2c464e9654a/userdata/shm major:0 minor:590 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19/userdata/shm major:0 minor:109 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a3ae0ef1861ea401e0b8a9b1d8fd796b2315f2b16e1b237d258aa72508e4e53/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a3ae0ef1861ea401e0b8a9b1d8fd796b2315f2b16e1b237d258aa72508e4e53/userdata/shm major:0 minor:1047 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2e4a3a4a7895f019e0118f1584bc95eca1f9c60af18c9d3fe595f768be766c6d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2e4a3a4a7895f019e0118f1584bc95eca1f9c60af18c9d3fe595f768be766c6d/userdata/shm major:0 minor:457 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2/userdata/shm major:0 minor:844 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/2fe7b69e87a4fa6425da976dffbe87c8c66862e1127867967d8f83ef262d49b7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2fe7b69e87a4fa6425da976dffbe87c8c66862e1127867967d8f83ef262d49b7/userdata/shm major:0 minor:456 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65/userdata/shm major:0 minor:90 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c/userdata/shm major:0 minor:426 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251/userdata/shm major:0 minor:265 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/390d92c6b1bf8de4d4ea48cb675d878d3b2cbd2b0311fc47e5e4feef80f55449/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/390d92c6b1bf8de4d4ea48cb675d878d3b2cbd2b0311fc47e5e4feef80f55449/userdata/shm major:0 minor:458 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919/userdata/shm major:0 minor:491 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858/userdata/shm major:0 minor:701 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5/userdata/shm major:0 minor:142 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4a9a41f76fe188e7c2fc303922714d8a4a4540bbc426c47477e0dbcbe14a461c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4a9a41f76fe188e7c2fc303922714d8a4a4540bbc426c47477e0dbcbe14a461c/userdata/shm major:0 minor:619 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4f70e184622d577e74124d1d17bc445ea80514437cbc221bcb9f2c6f012aa2ca/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4f70e184622d577e74124d1d17bc445ea80514437cbc221bcb9f2c6f012aa2ca/userdata/shm major:0 minor:700 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b/userdata/shm major:0 minor:821 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/5759216ebfee850b79609783445de8124c370c8bac5b63e2b5f03e38c742e1f0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5759216ebfee850b79609783445de8124c370c8bac5b63e2b5f03e38c742e1f0/userdata/shm major:0 minor:718 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5f4e5674ade432e52f9563a1f07684d2d9624c5df1e6b8e0fa3c971d3c078df8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5f4e5674ade432e52f9563a1f07684d2d9624c5df1e6b8e0fa3c971d3c078df8/userdata/shm major:0 minor:764 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/646d9925ac7d679e5fe105dacc2e5ba2bf65b630c171bd0e095c89f902ecba0a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/646d9925ac7d679e5fe105dacc2e5ba2bf65b630c171bd0e095c89f902ecba0a/userdata/shm major:0 minor:460 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/64cac6ba3a561adbc8f8770dc2f28e49933388f06613c25151f7bbd0ceb39107/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/64cac6ba3a561adbc8f8770dc2f28e49933388f06613c25151f7bbd0ceb39107/userdata/shm major:0 minor:455 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6c3bc64f22f8c58f9e978db84c7754f9ee2b132931d3190f29d081554cf105af/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c3bc64f22f8c58f9e978db84c7754f9ee2b132931d3190f29d081554cf105af/userdata/shm major:0 minor:74 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6d81df6e0c2c501a006e6d355e7ca64b7f375686077a624175b4786dbf2e5138/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6d81df6e0c2c501a006e6d355e7ca64b7f375686077a624175b4786dbf2e5138/userdata/shm major:0 minor:746 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524/userdata/shm major:0 minor:246 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7623887564e1fd29b1c01e5d18c6715a43b71a693407bef1bea029e2735f11dd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7623887564e1fd29b1c01e5d18c6715a43b71a693407bef1bea029e2735f11dd/userdata/shm major:0 minor:620 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/77ae6dbbf39c4d2991c10b142e9d6fe23b3ada856897b7bc34aa3b7d69fa418b/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/77ae6dbbf39c4d2991c10b142e9d6fe23b3ada856897b7bc34aa3b7d69fa418b/userdata/shm major:0 minor:322 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/77b4f8a8bc891942c93fc6bc58a70209e4d2685ce12294e206b71662186490b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/77b4f8a8bc891942c93fc6bc58a70209e4d2685ce12294e206b71662186490b9/userdata/shm major:0 minor:452 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9/userdata/shm major:0 minor:253 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8988806dc69dce5b61c53cc2845447a33f520244d709f93fdb6f76499aee8916/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8988806dc69dce5b61c53cc2845447a33f520244d709f93fdb6f76499aee8916/userdata/shm major:0 minor:808 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f/userdata/shm major:0 minor:837 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a/userdata/shm major:0 minor:972 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a9692d62aeb99fb7d4d3fc80637ffdf1ea3947790e26d640f42aacc16302c11/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a9692d62aeb99fb7d4d3fc80637ffdf1ea3947790e26d640f42aacc16302c11/userdata/shm major:0 minor:318 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190/userdata/shm major:0 minor:462 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9fb60bfa59d2ff40288f456815269ff4c838e82195edd334933c8654b4f8dedd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9fb60bfa59d2ff40288f456815269ff4c838e82195edd334933c8654b4f8dedd/userdata/shm major:0 minor:721 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a/userdata/shm major:0 minor:105 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff/userdata/shm major:0 minor:974 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018/userdata/shm major:0 minor:251 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350/userdata/shm major:0 minor:245 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/af7a768842b9cbb587f10537824efb3089e2d3b4f70fb674c1d644bca3af49d7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/af7a768842b9cbb587f10537824efb3089e2d3b4f70fb674c1d644bca3af49d7/userdata/shm major:0 minor:1041 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b090a7b841b2284b4a367b1fe9eb531751b92400aca909b51b87e9d7691a206c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b090a7b841b2284b4a367b1fe9eb531751b92400aca909b51b87e9d7691a206c/userdata/shm major:0 minor:776 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f/userdata/shm major:0 minor:333 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b6406db9242e3599a9f6b43c6cc7f931a2398c12649757d5a331d9757d32028e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b6406db9242e3599a9f6b43c6cc7f931a2398c12649757d5a331d9757d32028e/userdata/shm major:0 minor:838 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5/userdata/shm major:0 minor:271 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/bcfacb71ae88d504692e95ad77d6c9b51c2d2697daec2bf687474302cc5abf90/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bcfacb71ae88d504692e95ad77d6c9b51c2d2697daec2bf687474302cc5abf90/userdata/shm major:0 minor:397 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa/userdata/shm major:0 minor:1039 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c918fb3b270e41c6d62b6e571b5882afaab66a46ce66ce229de4e70f9853f259/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c918fb3b270e41c6d62b6e571b5882afaab66a46ce66ce229de4e70f9853f259/userdata/shm major:0 minor:392 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d/userdata/shm major:0 minor:1017 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327/userdata/shm major:0 minor:275 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/e34fa9d84124b6c127298dbbcc66ee1981c2d493a18d9fee5da615255d116cb0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e34fa9d84124b6c127298dbbcc66ee1981c2d493a18d9fee5da615255d116cb0/userdata/shm major:0 minor:395 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e3e74e8a6d87769b2b8f6bdae5a948fbb44f464be31e39d10a8d9e290f6b63c1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e3e74e8a6d87769b2b8f6bdae5a948fbb44f464be31e39d10a8d9e290f6b63c1/userdata/shm major:0 minor:840 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2/userdata/shm major:0 minor:256 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e95e82ba3152944d5f266f4315ecef6f288f0249fcf6dd92d242f6cd35eb008a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e95e82ba3152944d5f266f4315ecef6f288f0249fcf6dd92d242f6cd35eb008a/userdata/shm major:0 minor:719 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/edb84f3680f6b7a9122dea49c8ac75c4b3614e7e24eb119b118fbf82de0d5e2c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/edb84f3680f6b7a9122dea49c8ac75c4b3614e7e24eb119b118fbf82de0d5e2c/userdata/shm major:0 minor:843 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9/userdata/shm major:0 minor:308 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83/userdata/shm major:0 minor:46 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/fe76c4da023ee8241529e5f2a6a092dc48a1a51d30db462a00bc458437ba96ee/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fe76c4da023ee8241529e5f2a6a092dc48a1a51d30db462a00bc458437ba96ee/userdata/shm major:0 minor:980 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~projected/kube-api-access-v6lnq:{mountpoint:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~projected/kube-api-access-v6lnq major:0 minor:724 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/encryption-config major:0 minor:743 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/etcd-client major:0 minor:741 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/serving-cert major:0 minor:742 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~projected/kube-api-access-ffs2h:{mountpoint:/var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~projected/kube-api-access-ffs2h major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~secret/srv-cert major:0 minor:444 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~projected/kube-api-access-892f7:{mountpoint:/var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~projected/kube-api-access-892f7 major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~secret/srv-cert major:0 minor:443 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0529b217-a9ef-48fb-b40a-b6789c640c20/volumes/kubernetes.io~projected/kube-api-access-m5x2b:{mountpoint:/var/lib/kubelet/pods/0529b217-a9ef-48fb-b40a-b6789c640c20/volumes/kubernetes.io~projected/kube-api-access-m5x2b major:0 minor:155 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0529b217-a9ef-48fb-b40a-b6789c640c20/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/0529b217-a9ef-48fb-b40a-b6789c640c20/volumes/kubernetes.io~secret/proxy-tls major:0 minor:102 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/06ecac2e-bffa-474b-a824-9ba4a194159a/volumes/kubernetes.io~projected/kube-api-access-6p29b:{mountpoint:/var/lib/kubelet/pods/06ecac2e-bffa-474b-a824-9ba4a194159a/volumes/kubernetes.io~projected/kube-api-access-6p29b major:0 minor:766 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/06ecac2e-bffa-474b-a824-9ba4a194159a/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/06ecac2e-bffa-474b-a824-9ba4a194159a/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:762 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~projected/kube-api-access-vjkdx:{mountpoint:/var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~projected/kube-api-access-vjkdx major:0 minor:820 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~secret/cert major:0 minor:819 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:814 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~projected/kube-api-access-kd99t:{mountpoint:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~projected/kube-api-access-kd99t major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/etcd-client major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~projected/kube-api-access-q6smf:{mountpoint:/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~projected/kube-api-access-q6smf major:0 minor:1081 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~secret/certs major:0 minor:1077 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1076 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~projected/kube-api-access-ch8qd:{mountpoint:/var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~projected/kube-api-access-ch8qd major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:441 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~projected/kube-api-access-fvmjs:{mountpoint:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~projected/kube-api-access-fvmjs major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~projected/kube-api-access-xkwfv:{mountpoint:/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~projected/kube-api-access-xkwfv major:0 minor:242 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:439 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~projected/kube-api-access-rpnm8:{mountpoint:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~projected/kube-api-access-rpnm8 major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2157cb66-d458-4353-bc9c-ef761e61e5c5/volumes/kubernetes.io~projected/kube-api-access-gntlk:{mountpoint:/var/lib/kubelet/pods/2157cb66-d458-4353-bc9c-ef761e61e5c5/volumes/kubernetes.io~projected/kube-api-access-gntlk major:0 minor:338 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes/kubernetes.io~projected/kube-api-access-xl7xt:{mountpoint:/var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes/kubernetes.io~projected/kube-api-access-xl7xt major:0 minor:684 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes/kubernetes.io~secret/serving-cert major:0 minor:598 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/kube-api-access-hqf9z:{mountpoint:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/kube-api-access-hqf9z major:0 minor:243 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:386 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2563ecb2-5783-4c45-a7f6-180e14e1c8c4/volumes/kubernetes.io~projected/kube-api-access-4fcqg:{mountpoint:/var/lib/kubelet/pods/2563ecb2-5783-4c45-a7f6-180e14e1c8c4/volumes/kubernetes.io~projected/kube-api-access-4fcqg major:0 minor:824 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2563ecb2-5783-4c45-a7f6-180e14e1c8c4/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/2563ecb2-5783-4c45-a7f6-180e14e1c8c4/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:337 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/257ae542-4a06-42d3-b3e8-bf0a376494a8/volumes/kubernetes.io~projected/kube-api-access-fswp7:{mountpoint:/var/lib/kubelet/pods/257ae542-4a06-42d3-b3e8-bf0a376494a8/volumes/kubernetes.io~projected/kube-api-access-fswp7 major:0 minor:826 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/258f571e-5ec8-42df-b4ba-17457d87d10d/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/258f571e-5ec8-42df-b4ba-17457d87d10d/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1034 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~projected/kube-api-access-frmjp:{mountpoint:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~projected/kube-api-access-frmjp major:0 minor:238 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~secret/serving-cert major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/277614e8-838f-4773-bcfc-89f19c620dee/volumes/kubernetes.io~projected/kube-api-access-jzvxz:{mountpoint:/var/lib/kubelet/pods/277614e8-838f-4773-bcfc-89f19c620dee/volumes/kubernetes.io~projected/kube-api-access-jzvxz major:0 minor:1038 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2c3e94d4-5c6d-4092-975c-e5bca49eb397/volumes/kubernetes.io~projected/kube-api-access-htb49:{mountpoint:/var/lib/kubelet/pods/2c3e94d4-5c6d-4092-975c-e5bca49eb397/volumes/kubernetes.io~projected/kube-api-access-htb49 major:0 minor:332 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2c3e94d4-5c6d-4092-975c-e5bca49eb397/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/2c3e94d4-5c6d-4092-975c-e5bca49eb397/volumes/kubernetes.io~secret/signing-key major:0 minor:331 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes/kubernetes.io~projected/kube-api-access-4xqz6:{mountpoint:/var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes/kubernetes.io~projected/kube-api-access-4xqz6 major:0 minor:763 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes/kubernetes.io~secret/serving-cert major:0 minor:761 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/3e15f776-d153-4289-91c7-893584104185/volumes/kubernetes.io~projected/kube-api-access-2gcf6:{mountpoint:/var/lib/kubelet/pods/3e15f776-d153-4289-91c7-893584104185/volumes/kubernetes.io~projected/kube-api-access-2gcf6 major:0 minor:438 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3e15f776-d153-4289-91c7-893584104185/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/3e15f776-d153-4289-91c7-893584104185/volumes/kubernetes.io~secret/metrics-tls major:0 minor:413 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~projected/kube-api-access-d72bw:{mountpoint:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~projected/kube-api-access-d72bw major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~projected/kube-api-access major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/58685de6-b4ae-4229-870b-5143a6010450/volumes/kubernetes.io~projected/kube-api-access-kn5nv:{mountpoint:/var/lib/kubelet/pods/58685de6-b4ae-4229-870b-5143a6010450/volumes/kubernetes.io~projected/kube-api-access-kn5nv major:0 minor:260 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~projected/kube-api-access-twcrj:{mountpoint:/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~projected/kube-api-access-twcrj major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~secret/webhook-certs major:0 minor:440 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61427254-6722-4d1a-a96a-dadd24abbe94/volumes/kubernetes.io~projected/kube-api-access-6vbsc:{mountpoint:/var/lib/kubelet/pods/61427254-6722-4d1a-a96a-dadd24abbe94/volumes/kubernetes.io~projected/kube-api-access-6vbsc major:0 minor:827 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61427254-6722-4d1a-a96a-dadd24abbe94/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/61427254-6722-4d1a-a96a-dadd24abbe94/volumes/kubernetes.io~secret/proxy-tls major:0 minor:825 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~projected/kube-api-access-wlmhs:{mountpoint:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~projected/kube-api-access-wlmhs major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~projected/kube-api-access-ws7gk:{mountpoint:/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~projected/kube-api-access-ws7gk major:0 minor:973 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config major:0 minor:971 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~secret/prometheus-operator-tls:{mountpoint:/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~secret/prometheus-operator-tls major:0 minor:967 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7c5279e3-0165-4347-bfc7-87b80accaab3/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config:{mountpoint:/var/lib/kubelet/pods/7c5279e3-0165-4347-bfc7-87b80accaab3/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config major:0 minor:89 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~secret/serving-cert major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/84f78350-e85c-4377-97cd-9e9a1b2ff4ee/volumes/kubernetes.io~projected/kube-api-access-d5v4b:{mountpoint:/var/lib/kubelet/pods/84f78350-e85c-4377-97cd-9e9a1b2ff4ee/volumes/kubernetes.io~projected/kube-api-access-d5v4b major:0 minor:420 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~projected/kube-api-access major:0 minor:244 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8b07c5ae-1149-4031-bd92-6df4331e586c/volumes/kubernetes.io~projected/kube-api-access-4kn26:{mountpoint:/var/lib/kubelet/pods/8b07c5ae-1149-4031-bd92-6df4331e586c/volumes/kubernetes.io~projected/kube-api-access-4kn26 major:0 minor:299 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~projected/kube-api-access-f6fm9:{mountpoint:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~projected/kube-api-access-f6fm9 major:0 minor:103 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~secret/metrics-tls major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8dc7af5f-ff72-4f06-88df-a26ff4c0bded/volumes/kubernetes.io~projected/kube-api-access-vmf6l:{mountpoint:/var/lib/kubelet/pods/8dc7af5f-ff72-4f06-88df-a26ff4c0bded/volumes/kubernetes.io~projected/kube-api-access-vmf6l major:0 minor:978 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8dc7af5f-ff72-4f06-88df-a26ff4c0bded/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/8dc7af5f-ff72-4f06-88df-a26ff4c0bded/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:784 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~projected/kube-api-access-74lr7:{mountpoint:/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~projected/kube-api-access-74lr7 major:0 minor:123 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~secret/metrics-certs major:0 minor:442 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~projected/kube-api-access-8z5fj:{mountpoint:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~projected/kube-api-access-8z5fj major:0 minor:1037 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/default-certificate major:0 minor:1033 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1029 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/stats-auth major:0 minor:1035 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~projected/kube-api-access-ffmmr:{mountpoint:/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~projected/kube-api-access-ffmmr major:0 minor:832 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:831 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~secret/webhook-cert major:0 minor:830 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906/volumes/kubernetes.io~projected/kube-api-access-g8dpd:{mountpoint:/var/lib/kubelet/pods/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906/volumes/kubernetes.io~projected/kube-api-access-g8dpd major:0 minor:307 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906/volumes/kubernetes.io~secret/serving-cert major:0 minor:304 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e/volumes/kubernetes.io~projected/kube-api-access-5rqms:{mountpoint:/var/lib/kubelet/pods/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e/volumes/kubernetes.io~projected/kube-api-access-5rqms major:0 minor:1016 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e/volumes/kubernetes.io~secret/proxy-tls major:0 minor:726 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c/volumes/kubernetes.io~projected/kube-api-access-ppjzw:{mountpoint:/var/lib/kubelet/pods/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c/volumes/kubernetes.io~projected/kube-api-access-ppjzw major:0 minor:437 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587/volumes/kubernetes.io~projected/kube-api-access-qr5lp:{mountpoint:/var/lib/kubelet/pods/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587/volumes/kubernetes.io~projected/kube-api-access-qr5lp major:0 minor:317 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587/volumes/kubernetes.io~secret/webhook-certs major:0 minor:85 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a0917212-59d8-4799-a9bc-52e358c5e8a0/volumes/kubernetes.io~projected/kube-api-access-lrmcp:{mountpoint:/var/lib/kubelet/pods/a0917212-59d8-4799-a9bc-52e358c5e8a0/volumes/kubernetes.io~projected/kube-api-access-lrmcp major:0 minor:829 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a0917212-59d8-4799-a9bc-52e358c5e8a0/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/a0917212-59d8-4799-a9bc-52e358c5e8a0/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:828 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~projected/kube-api-access-bzxzq:{mountpoint:/var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~projected/kube-api-access-bzxzq major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:445 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:556 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~empty-dir/tmp major:0 minor:555 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~projected/kube-api-access-b5gkd:{mountpoint:/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~projected/kube-api-access-b5gkd major:0 minor:560 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~projected/kube-api-access-b9l88:{mountpoint:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~projected/kube-api-access-b9l88 major:0 minor:139 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~secret/webhook-cert major:0 minor:138 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a7b698d2-f23a-4404-bc63-757ca549356f/volumes/kubernetes.io~projected/kube-api-access-zltcf:{mountpoint:/var/lib/kubelet/pods/a7b698d2-f23a-4404-bc63-757ca549356f/volumes/kubernetes.io~projected/kube-api-access-zltcf major:0 minor:303 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~projected/kube-api-access-4nr6p:{mountpoint:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~projected/kube-api-access-4nr6p major:0 minor:611 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/encryption-config major:0 minor:609 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/etcd-client major:0 minor:610 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/serving-cert major:0 minor:608 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b02805e2-f186-4e59-bdfa-f4793263b468/volumes/kubernetes.io~projected/kube-api-access-cvl4j:{mountpoint:/var/lib/kubelet/pods/b02805e2-f186-4e59-bdfa-f4793263b468/volumes/kubernetes.io~projected/kube-api-access-cvl4j major:0 minor:804 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b02805e2-f186-4e59-bdfa-f4793263b468/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/b02805e2-f186-4e59-bdfa-f4793263b468/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:789 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b460735c-56aa-4dd3-a756-759859083e12/volumes/kubernetes.io~projected/kube-api-access-qr9x5:{mountpoint:/var/lib/kubelet/pods/b460735c-56aa-4dd3-a756-759859083e12/volumes/kubernetes.io~projected/kube-api-access-qr9x5 major:0 minor:1036 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b57f1c19-f44a-4405-8135-79aef1d1ce07/volumes/kubernetes.io~projected/kube-api-access-mnnnp:{mountpoint:/var/lib/kubelet/pods/b57f1c19-f44a-4405-8135-79aef1d1ce07/volumes/kubernetes.io~projected/kube-api-access-mnnnp major:0 minor:336 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b57f1c19-f44a-4405-8135-79aef1d1ce07/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/b57f1c19-f44a-4405-8135-79aef1d1ce07/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:335 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b5ed7aff-47c0-42f3-9a26-9385d2bde582/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/b5ed7aff-47c0-42f3-9a26-9385d2bde582/volumes/kubernetes.io~projected/kube-api-access major:0 minor:398 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/b5ed7aff-47c0-42f3-9a26-9385d2bde582/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b5ed7aff-47c0-42f3-9a26-9385d2bde582/volumes/kubernetes.io~secret/serving-cert major:0 minor:394 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b7090328-1191-4c7c-afed-603d7333014f/volumes/kubernetes.io~projected/kube-api-access-v9cxp:{mountpoint:/var/lib/kubelet/pods/b7090328-1191-4c7c-afed-603d7333014f/volumes/kubernetes.io~projected/kube-api-access-v9cxp major:0 minor:976 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b7090328-1191-4c7c-afed-603d7333014f/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/b7090328-1191-4c7c-afed-603d7333014f/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:842 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~projected/kube-api-access-hwfd8:{mountpoint:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~projected/kube-api-access-hwfd8 major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~secret/serving-cert major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cc66541c-6410-4824-b173-53747069429e/volumes/kubernetes.io~projected/kube-api-access-5p4cf:{mountpoint:/var/lib/kubelet/pods/cc66541c-6410-4824-b173-53747069429e/volumes/kubernetes.io~projected/kube-api-access-5p4cf major:0 minor:115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/kube-api-access-6nfl8:{mountpoint:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/kube-api-access-6nfl8 major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~secret/metrics-tls major:0 minor:389 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~projected/kube-api-access-s2kqq:{mountpoint:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~projected/kube-api-access-s2kqq major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:388 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:390 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~projected/ca-certs major:0 minor:429 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~projected/kube-api-access-nqrh5:{mountpoint:/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~projected/kube-api-access-nqrh5 major:0 minor:449 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:432 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e4b55ebf-cab8-4985-95cc-b28bc5ae0578/volumes/kubernetes.io~projected/kube-api-access-chxxr:{mountpoint:/var/lib/kubelet/pods/e4b55ebf-cab8-4985-95cc-b28bc5ae0578/volumes/kubernetes.io~projected/kube-api-access-chxxr major:0 minor:320 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e4b55ebf-cab8-4985-95cc-b28bc5ae0578/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/e4b55ebf-cab8-4985-95cc-b28bc5ae0578/volumes/kubernetes.io~secret/cert major:0 minor:823 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~projected/kube-api-access-b9768:{mountpoint:/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~projected/kube-api-access-b9768 major:0 minor:239 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~secret/metrics-tls major:0 minor:381 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e87ca16c-25de-4fea-b900-2960f4a5f95e/volumes/kubernetes.io~projected/kube-api-access-wrq5t:{mountpoint:/var/lib/kubelet/pods/e87ca16c-25de-4fea-b900-2960f4a5f95e/volumes/kubernetes.io~projected/kube-api-access-wrq5t major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc/volumes/kubernetes.io~projected/kube-api-access-zjxp2:{mountpoint:/var/lib/kubelet/pods/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc/volumes/kubernetes.io~projected/kube-api-access-zjxp2 major:0 minor:104 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/ec33c506-8abe-4659-84d3-a294c31b446c/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/ec33c506-8abe-4659-84d3-a294c31b446c/volumes/kubernetes.io~projected/ca-certs major:0 minor:428 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ec33c506-8abe-4659-84d3-a294c31b446c/volumes/kubernetes.io~projected/kube-api-access-jk4qr:{mountpoint:/var/lib/kubelet/pods/ec33c506-8abe-4659-84d3-a294c31b446c/volumes/kubernetes.io~projected/kube-api-access-jk4qr major:0 minor:450 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~projected/kube-api-access-9c92k:{mountpoint:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~projected/kube-api-access-9c92k major:0 minor:250 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~secret/serving-cert major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~projected/kube-api-access-tzdf2:{mountpoint:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~projected/kube-api-access-tzdf2 major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f99b999c-4213-4d29-ab14-26c584e88445/volumes/kubernetes.io~projected/kube-api-access-bn7vq:{mountpoint:/var/lib/kubelet/pods/f99b999c-4213-4d29-ab14-26c584e88445/volumes/kubernetes.io~projected/kube-api-access-bn7vq major:0 minor:300 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~projected/kube-api-access-zscfc:{mountpoint:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~projected/kube-api-access-zscfc major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fd91626c-38a8-462f-8bc0-96d57532de87/volumes/kubernetes.io~projected/kube-api-access-7mjm7:{mountpoint:/var/lib/kubelet/pods/fd91626c-38a8-462f-8bc0-96d57532de87/volumes/kubernetes.io~projected/kube-api-access-7mjm7 major:0 minor:433 fsType:tmpfs blockSize:0} overlay_0-1004:{mountpoint:/var/lib/containers/storage/overlay/5d4cd53cf2f17e0fbdc56d3b5d60931b8f01e43f27d8294d01c387cd0cc17ea6/merged major:0 minor:1004 fsType:overlay blockSize:0} overlay_0-1009:{mountpoint:/var/lib/containers/storage/overlay/25986b863388fa6199341f5c83249ba7349ccb6c97f726d447cafe23e2e4315d/merged major:0 minor:1009 fsType:overlay blockSize:0} overlay_0-1019:{mountpoint:/var/lib/containers/storage/overlay/72b2c59c41993b0c28e7a9d96b3d42f51c2d1fcd35289640ac145e5e316b6292/merged major:0 minor:1019 fsType:overlay blockSize:0} overlay_0-1021:{mountpoint:/var/lib/containers/storage/overlay/8c99ed8e518c5c8d149ff2945104cddeb24ddde2039a3de45b00791501073416/merged major:0 minor:1021 fsType:overlay blockSize:0} overlay_0-1023:{mountpoint:/var/lib/containers/storage/overlay/b64d58833ecda379f457e2c38a6c2fc3e6f1409c716c5f15b88afaf6c93c85b6/merged major:0 minor:1023 
fsType:overlay blockSize:0} overlay_0-1043:{mountpoint:/var/lib/containers/storage/overlay/f0421c32fdec439463cf848825fb34d6105ef22d8bd2f0bd4e82cd9c0b54861c/merged major:0 minor:1043 fsType:overlay blockSize:0} overlay_0-1049:{mountpoint:/var/lib/containers/storage/overlay/c175b0c3e629d9b8471f5d63acc75b788e98986cba7dfb3e65d048e421eb5858/merged major:0 minor:1049 fsType:overlay blockSize:0} overlay_0-1053:{mountpoint:/var/lib/containers/storage/overlay/ce65c1cef0e3d2beb667826543fba95c7b23ef4d9f71546d7aabda249c88092b/merged major:0 minor:1053 fsType:overlay blockSize:0} overlay_0-1055:{mountpoint:/var/lib/containers/storage/overlay/52356411db031d7d107fc7eb92c9954030996ca6c83596834480915ba18679b0/merged major:0 minor:1055 fsType:overlay blockSize:0} overlay_0-1060:{mountpoint:/var/lib/containers/storage/overlay/ce55d3bc03608d4a46c24d4f7de4e7d92cb6dc70e8f92c764202a176a242fd31/merged major:0 minor:1060 fsType:overlay blockSize:0} overlay_0-1062:{mountpoint:/var/lib/containers/storage/overlay/f8c8aa4efb72ea52fde73c85477aaabad207f8e4a5b417534694782df65e4c8b/merged major:0 minor:1062 fsType:overlay blockSize:0} overlay_0-1068:{mountpoint:/var/lib/containers/storage/overlay/4ced0d61bcd4142d67e382036495ebb26be9efb9f7ee0d6514741e82fdf71aa6/merged major:0 minor:1068 fsType:overlay blockSize:0} overlay_0-107:{mountpoint:/var/lib/containers/storage/overlay/51ee4ddf1cec0b7db149f561d502a0ed68d3f058aff373243fa18c6e50637da7/merged major:0 minor:107 fsType:overlay blockSize:0} overlay_0-1082:{mountpoint:/var/lib/containers/storage/overlay/2aaedf6d2356c3c001764bda619d47a95a00f38dd6c6317d23105ca1b134a3a4/merged major:0 minor:1082 fsType:overlay blockSize:0} overlay_0-1084:{mountpoint:/var/lib/containers/storage/overlay/b5593ec14f980812d04dfc70dff0a54a77f8bd48e494b03ce61f90464bf9abc5/merged major:0 minor:1084 fsType:overlay blockSize:0} overlay_0-1093:{mountpoint:/var/lib/containers/storage/overlay/63c5f4d3e2eede7cbc31a6ab8278f954a2da31b601cffa2c531234a27f6e8b8a/merged major:0 minor:1093 
fsType:overlay blockSize:0} overlay_0-1095:{mountpoint:/var/lib/containers/storage/overlay/79a5fce7497c4d197915c9e5e8a76e7fcba9a016b3db1bc8c596b7d8ef7b01f3/merged major:0 minor:1095 fsType:overlay blockSize:0} overlay_0-1097:{mountpoint:/var/lib/containers/storage/overlay/1415d6e2e674dcf6ae0c887796dcea09a6ce7d752f8c36b8790d85c516854a05/merged major:0 minor:1097 fsType:overlay blockSize:0} overlay_0-1103:{mountpoint:/var/lib/containers/storage/overlay/7d954c41be93a6912db45c3334ba588df1a047dc098fa6ca72d474818bdc96b5/merged major:0 minor:1103 fsType:overlay blockSize:0} overlay_0-1108:{mountpoint:/var/lib/containers/storage/overlay/025f92e1719f7e77f4335fd853f6c2f30f4ad66548646bb74655a38d38df1204/merged major:0 minor:1108 fsType:overlay blockSize:0} overlay_0-111:{mountpoint:/var/lib/containers/storage/overlay/a2b25121abd050ee1077647cd54f1731a5d154c9e8827bc213a663b0943aba5f/merged major:0 minor:111 fsType:overlay blockSize:0} overlay_0-1116:{mountpoint:/var/lib/containers/storage/overlay/e8c471a93b7e15872f7574fa23515209a8291272395a424f62cefca9e9c1df5c/merged major:0 minor:1116 fsType:overlay blockSize:0} overlay_0-1121:{mountpoint:/var/lib/containers/storage/overlay/5965dc8b0a31f6d70e3c5887ff98792d10aa82d87e0073b9d16ae88ba1f19bda/merged major:0 minor:1121 fsType:overlay blockSize:0} overlay_0-1126:{mountpoint:/var/lib/containers/storage/overlay/cb56fd746e67e211234d97421dfe34d75b40343fa5fbb311fda93938dfd471cc/merged major:0 minor:1126 fsType:overlay blockSize:0} overlay_0-114:{mountpoint:/var/lib/containers/storage/overlay/e3a9aef77397681738561e9622cd1cbc0efec29aa6517f286a6d818c09b2b756/merged major:0 minor:114 fsType:overlay blockSize:0} overlay_0-116:{mountpoint:/var/lib/containers/storage/overlay/9029ae3c8111f47de861db2a0e29b3275fe9142bb83d246e8ca4dfbad6a0e7a3/merged major:0 minor:116 fsType:overlay blockSize:0} overlay_0-118:{mountpoint:/var/lib/containers/storage/overlay/e1d135856ea7b9f25e9d58d18f1fe31be82b86e9c2f5ce4ac84364aa9a9fb261/merged major:0 minor:118 
fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/d454cda786f8b4e7e94db42bc98005f38a20d8041248a399f447b9403208c68d/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/5d1950ddba4d37696d6e2782d7bfa3f829e8c0a5bd53b7dff381b344e4009ec7/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/9177e38a4c08bbb28fbc771f9305d06e7095d26e22fcc940217219997aaafba2/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/13d6ea04e1be1e67e086be3e3dd0033fd79232507de8553b7441b55ebf1985ee/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-140:{mountpoint:/var/lib/containers/storage/overlay/5da52f1bda4e856534bf0ff70330e601aebebcc75c60b5513f5899f743a6feb9/merged major:0 minor:140 fsType:overlay blockSize:0} overlay_0-144:{mountpoint:/var/lib/containers/storage/overlay/6d5b96f8d256abc5e49d0b929a834320335035114d9c04dfe2c43b427cfc3aaf/merged major:0 minor:144 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/ffe227a591777d11831a5f2e26ec7e117e21716ac77cf7bf12c385f040edc6db/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/737e9c58faee00c4c2872b97e8b290fc78c01720c0546498e61fa21b975b3156/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-159:{mountpoint:/var/lib/containers/storage/overlay/9868f778aac90102362a64b2880fbe5a146631be470892b8fac2a1d11fc7cd59/merged major:0 minor:159 fsType:overlay blockSize:0} overlay_0-166:{mountpoint:/var/lib/containers/storage/overlay/b384c42209d804b06b6915392cc8a22c18073f1a4ceffe284fa3e224021c6be2/merged major:0 minor:166 fsType:overlay blockSize:0} overlay_0-171:{mountpoint:/var/lib/containers/storage/overlay/6eaa2cbee3e1d380f878b1ac04b4c710667081431b9d5c003e02b8fd12f48952/merged major:0 minor:171 fsType:overlay 
blockSize:0} overlay_0-178:{mountpoint:/var/lib/containers/storage/overlay/7c602a28fed4530a6cb28784078742854765a3c294765ae00e49d97b74cc681f/merged major:0 minor:178 fsType:overlay blockSize:0} overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/43dc212ad089f6ff2917c951f147af293821268e11330d15db5a91fb9343271e/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/5e5885bfaca2d8c1f291e909673bacbb8cc59606cee3b2e9ed301be872fd0fb4/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/6d36a3eb9028da7f760341e60345bbf4023ec073aceb9dd458926d24593c6f4e/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/5d8122c338f3daedb4bb19657253f8795a9753bbf6e3019bf7fec9d3fcc8a2f4/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-199:{mountpoint:/var/lib/containers/storage/overlay/b950239c02165cf8a57c5cc1570c7f4b09d60d2cf003641ca7dea33e126ff115/merged major:0 minor:199 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/9d563e4afb673a69a96b12b27d43ad8a0bb745b536e0716838cab0d12fcd9c59/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/8de371efd96578ccc939d25d4b30c41156f73a21bd0c65812508317c599aae3d/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/f899e18fa8dbf3319e0f2fa72649691ff5ab04c9a826218dc89a8718b8b3b1cb/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-279:{mountpoint:/var/lib/containers/storage/overlay/ab26f40fd99753f5bd3fafeb100b829d22e99dd20f3d1152feb0a088c628c3ac/merged major:0 minor:279 fsType:overlay blockSize:0} overlay_0-281:{mountpoint:/var/lib/containers/storage/overlay/3c13b3b5f7fe1c6a8072b8566f3bb234441deefc87e04a62055d0a816d37f30d/merged major:0 minor:281 fsType:overlay blockSize:0} 
overlay_0-283:{mountpoint:/var/lib/containers/storage/overlay/259115946e4112665a27b644eb3b91bee2b22821ed4347acdf601e6960a0224f/merged major:0 minor:283 fsType:overlay blockSize:0} overlay_0-285:{mountpoint:/var/lib/containers/storage/overlay/0fb4682fb6a43f9080f194d26b5048f3223f37b986c910a3620d951d23c75d68/merged major:0 minor:285 fsType:overlay blockSize:0} overlay_0-287:{mountpoint:/var/lib/containers/storage/overlay/653aa83583d1b46bc3b7b604338b291c9dce41984a7e631867276167378628ef/merged major:0 minor:287 fsType:overlay blockSize:0} overlay_0-289:{mountpoint:/var/lib/containers/storage/overlay/3e085f9531fbb969a97a96d7bfaa906838dfd5273709b75371d1e0c43db1b2e5/merged major:0 minor:289 fsType:overlay blockSize:0} overlay_0-291:{mountpoint:/var/lib/containers/storage/overlay/6054552f2c76bbf24cd482c1b3f5dbe5856cfbebba509a62813070a37faccbbc/merged major:0 minor:291 fsType:overlay blockSize:0} overlay_0-293:{mountpoint:/var/lib/containers/storage/overlay/77ad5fbff258647a532dc461679dc56b9f856c17aecd286d3b6bdce623541d6b/merged major:0 minor:293 fsType:overlay blockSize:0} overlay_0-295:{mountpoint:/var/lib/containers/storage/overlay/c71d16d546cb0676dbd656400e4e52aca6a9e7fe8800ec2c312f5dcffa93fd43/merged major:0 minor:295 fsType:overlay blockSize:0} overlay_0-297:{mountpoint:/var/lib/containers/storage/overlay/4417774e3b57d233551490f3238b8b38fada92b33064129303e9a293bb033c05/merged major:0 minor:297 fsType:overlay blockSize:0} overlay_0-301:{mountpoint:/var/lib/containers/storage/overlay/eb828c4a888ea55c86f7686c5e47943027d16f2f7a09d4a5910a4a16b93beb52/merged major:0 minor:301 fsType:overlay blockSize:0} overlay_0-305:{mountpoint:/var/lib/containers/storage/overlay/80f944536cd1878c252785b54025214386fb3c106f309487b130a5799697eb43/merged major:0 minor:305 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/9853caf89c97d3b18857893c85178bc6bad7ed925af7193bf6e88992c2745e68/merged major:0 minor:311 fsType:overlay blockSize:0} 
overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/b93e02114017d4a502302bcfbc216af47b7df1f07ee97013c30607a5008fdbb1/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-321:{mountpoint:/var/lib/containers/storage/overlay/f3555979896b088600749270a04ce78e37226297242e862ca051e72e866a5d6a/merged major:0 minor:321 fsType:overlay blockSize:0} overlay_0-323:{mountpoint:/var/lib/containers/storage/overlay/feeb2c3e970e2195680a4193bc288030602d1946c38693837728e8698bf25af9/merged major:0 minor:323 fsType:overlay blockSize:0} overlay_0-346:{mountpoint:/var/lib/containers/storage/overlay/e18a8e192a649c6039303fd02d8de4cdd3209663421360d5930a9ae2ac177110/merged major:0 minor:346 fsType:overlay blockSize:0} overlay_0-350:{mountpoint:/var/lib/containers/storage/overlay/ef6ff8d8cf0e7342fc502a9e0f569983f7d87b5eabfd352a1f0eabd36f587a41/merged major:0 minor:350 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/c1c54284e533e053376a9a454f73864743722251ce2504ab0b6483ea299af071/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-361:{mountpoint:/var/lib/containers/storage/overlay/585fb7bc077accf78a644a9959d355ddb9b5eec95533c544f6b9caa3652794ef/merged major:0 minor:361 fsType:overlay blockSize:0} overlay_0-365:{mountpoint:/var/lib/containers/storage/overlay/48db8062050b819d62b0f61c9985f3cac2a02b7543547313f73a81b830f137d4/merged major:0 minor:365 fsType:overlay blockSize:0} overlay_0-366:{mountpoint:/var/lib/containers/storage/overlay/83a78e24f11b50dfb3f62bc8b50cfaee3d5a7352cc5f0179c144289bb9e0b727/merged major:0 minor:366 fsType:overlay blockSize:0} overlay_0-369:{mountpoint:/var/lib/containers/storage/overlay/1f0d22951feedbbccda6e50f466b17627a5f12842d7cc88479385826d41f35f4/merged major:0 minor:369 fsType:overlay blockSize:0} overlay_0-387:{mountpoint:/var/lib/containers/storage/overlay/fae3f3c178374e47a9918535df9b44a01f2455eb94744c526e25bcd9a7c7fc00/merged major:0 minor:387 fsType:overlay blockSize:0} 
overlay_0-400:{mountpoint:/var/lib/containers/storage/overlay/96632c38a560b235a8f6158dc8ae7a907673475f686938cac95e1114fa161839/merged major:0 minor:400 fsType:overlay blockSize:0} overlay_0-403:{mountpoint:/var/lib/containers/storage/overlay/4b9effdf2d12829b70cb61788436f1a806ecfbeeb4122aac09fad75ba25a9b5b/merged major:0 minor:403 fsType:overlay blockSize:0} overlay_0-405:{mountpoint:/var/lib/containers/storage/overlay/5b59c6bb6230abced75b12d45a5ff2d05de746bff93ed864ff0302fc3ad8807e/merged major:0 minor:405 fsType:overlay blockSize:0} overlay_0-407:{mountpoint:/var/lib/containers/storage/overlay/31018af83795eb943a28a60a9c8bd3032c080a081c0b19df2a61482c119dffc5/merged major:0 minor:407 fsType:overlay blockSize:0} overlay_0-409:{mountpoint:/var/lib/containers/storage/overlay/59b660e2fe88cb29559f40091f147b3e31e0f2cf7493733f002665b5292170f0/merged major:0 minor:409 fsType:overlay blockSize:0} overlay_0-41:{mountpoint:/var/lib/containers/storage/overlay/0c2aa20999f4d781eca6400565a8bda800ae591a1d0c028161bc68f914999c35/merged major:0 minor:41 fsType:overlay blockSize:0} overlay_0-411:{mountpoint:/var/lib/containers/storage/overlay/f874fce4b0958685b58ac25afea10360ed903a1bbd39d8e8233175b0967eee0f/merged major:0 minor:411 fsType:overlay blockSize:0} overlay_0-418:{mountpoint:/var/lib/containers/storage/overlay/23e47ce4a25576003f325e73cb27e5fdfa14f5c9cbf6c227cf2a0f19805ad507/merged major:0 minor:418 fsType:overlay blockSize:0} overlay_0-424:{mountpoint:/var/lib/containers/storage/overlay/b3b90344bfd8cec1ac228c8ab0f039b0b48d767b1fd2b00c0fadd0ea03bd7f63/merged major:0 minor:424 fsType:overlay blockSize:0} overlay_0-430:{mountpoint:/var/lib/containers/storage/overlay/163a86c73dbffa7af2a1c6d5978329c6aa7fdacc1f513cf51e0d1b964cdc70fc/merged major:0 minor:430 fsType:overlay blockSize:0} overlay_0-472:{mountpoint:/var/lib/containers/storage/overlay/fa1343950ebf068700d075ab90b925fd71717ceb76d16e38cd08bb81a932996c/merged major:0 minor:472 fsType:overlay blockSize:0} 
overlay_0-474:{mountpoint:/var/lib/containers/storage/overlay/5872c42be9f2c753125f44508c0fb78954730a93a575cb1372af79427dcd96f6/merged major:0 minor:474 fsType:overlay blockSize:0} overlay_0-476:{mountpoint:/var/lib/containers/storage/overlay/93c4d69430f76e88c3a0cb23fd09d794892aaf845228608d3d0e41a896891454/merged major:0 minor:476 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/8a6c5602b44bbf86b7d99863c6b93a6e3f2a81d50c5fccb5f4866d781aa21621/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-480:{mountpoint:/var/lib/containers/storage/overlay/3a645fdf84e28056c1a7dea63be91dd076870d7860c7ae5ff01691e5a69989ed/merged major:0 minor:480 fsType:overlay blockSize:0} overlay_0-482:{mountpoint:/var/lib/containers/storage/overlay/c174305f8cf023c585ae604e7089631f632b55f44dadabde281bd82ae4cd606a/merged major:0 minor:482 fsType:overlay blockSize:0} overlay_0-487:{mountpoint:/var/lib/containers/storage/overlay/8665457f6058c9b19233a144b64b0d489d83ccff26a62372771b9b6f28c5633d/merged major:0 minor:487 fsType:overlay blockSize:0} overlay_0-489:{mountpoint:/var/lib/containers/storage/overlay/c71765436b1cbe0de099ece61af0bad6de27cb11edee0f43d34e39d1d7f5ba63/merged major:0 minor:489 fsType:overlay blockSize:0} overlay_0-493:{mountpoint:/var/lib/containers/storage/overlay/3ac3a5dfaa588c29e7f4787d653280a5d52699bf77602f079b77f668d47ac810/merged major:0 minor:493 fsType:overlay blockSize:0} overlay_0-503:{mountpoint:/var/lib/containers/storage/overlay/cf803b6eb5b4444616191cafca426b650b0d8c2203ae689c87d091c1fe6de070/merged major:0 minor:503 fsType:overlay blockSize:0} overlay_0-510:{mountpoint:/var/lib/containers/storage/overlay/204b47c47f0d8b8ccd1eac35e01bf36995f4714f36124a66fd439b5859bf1d8c/merged major:0 minor:510 fsType:overlay blockSize:0} overlay_0-511:{mountpoint:/var/lib/containers/storage/overlay/0e1b71a665c50689697e2bec4de3e20c1bb9c7953c5681f8f45a530af46d7857/merged major:0 minor:511 fsType:overlay blockSize:0} 
overlay_0-515:{mountpoint:/var/lib/containers/storage/overlay/3e8bb776e2192987e4aeba8e4732a16c727bd227043d959c651fedc64c479adb/merged major:0 minor:515 fsType:overlay blockSize:0} overlay_0-519:{mountpoint:/var/lib/containers/storage/overlay/553bcb79e6b2d4c8f562c774f907fd9c97874cce0d57cc78447dd5a4f29f4d61/merged major:0 minor:519 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/24dca11bfc7ea37f221888024b9a1eb036ef0079177c980e95515f4c4fd8195b/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-525:{mountpoint:/var/lib/containers/storage/overlay/13a6e8b72118ad53a8af21a8376d0ead0179ba2230fee53bb151b2e17ec4d0d8/merged major:0 minor:525 fsType:overlay blockSize:0} overlay_0-527:{mountpoint:/var/lib/containers/storage/overlay/5060959c192db37cdfd7042a3cdab7c3d7c272d398605ac32f98f7c3048b2e25/merged major:0 minor:527 fsType:overlay blockSize:0} overlay_0-532:{mountpoint:/var/lib/containers/storage/overlay/78660dd09513ca973b5784dcef7a037fc029428f13faef2445527bb5c84f922b/merged major:0 minor:532 fsType:overlay blockSize:0} overlay_0-534:{mountpoint:/var/lib/containers/storage/overlay/13b15638897a78b319a69b48231f6a955820199b1e7acf6f273321efff644b92/merged major:0 minor:534 fsType:overlay blockSize:0} overlay_0-536:{mountpoint:/var/lib/containers/storage/overlay/711ab510c9446f474d34d65c9537e05b667552303356bebd19ac4ed68f6e8239/merged major:0 minor:536 fsType:overlay blockSize:0} overlay_0-538:{mountpoint:/var/lib/containers/storage/overlay/2586569c36133e00ad09daeabfd807f8386d981968d46a96317e9814eeed8824/merged major:0 minor:538 fsType:overlay blockSize:0} overlay_0-54:{mountpoint:/var/lib/containers/storage/overlay/6649ad7e63213410c50f55539c49049c1ef19b88571a7022161a4639c844f06c/merged major:0 minor:54 fsType:overlay blockSize:0} overlay_0-540:{mountpoint:/var/lib/containers/storage/overlay/7473b5f44ef6e2f4a74ec711334e31b86c1213319ab05bc42f0c3ad00bbdc0c7/merged major:0 minor:540 fsType:overlay blockSize:0} 
overlay_0-543:{mountpoint:/var/lib/containers/storage/overlay/0da1861f5e0b306114127a234869f8a8223a6ef028d289bfc7425354e601cfce/merged major:0 minor:543 fsType:overlay blockSize:0} overlay_0-547:{mountpoint:/var/lib/containers/storage/overlay/ba040d4d8df72048356c5e72106361fbdec89d0d4e0bf6a7d24898a718960519/merged major:0 minor:547 fsType:overlay blockSize:0} overlay_0-554:{mountpoint:/var/lib/containers/storage/overlay/244986acb246b6bee67893b13e67108b85bc7706f6a6b387b194260ef519eaaf/merged major:0 minor:554 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/4c5dc0ca796bd3714914b2eb395c2323068b733458d9d0d1999682238ad40acf/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-562:{mountpoint:/var/lib/containers/storage/overlay/920c457799a98940dd02aad82158adf2f1f65450e70c522d3e39a4a7354cfe35/merged major:0 minor:562 fsType:overlay blockSize:0} overlay_0-564:{mountpoint:/var/lib/containers/storage/overlay/60b6d5bfd03d5ac6f675063e4564eb9e2abf586b17773c7657e5309628562954/merged major:0 minor:564 fsType:overlay blockSize:0} overlay_0-568:{mountpoint:/var/lib/containers/storage/overlay/3598cb7e678193710df04b5462227bd0255ad686399be641f5df65b32ad7f1de/merged major:0 minor:568 fsType:overlay blockSize:0} overlay_0-592:{mountpoint:/var/lib/containers/storage/overlay/9f336a370977d8dad887b43f664b823b93c35b7d1b22711961fdda635b2e0c14/merged major:0 minor:592 fsType:overlay blockSize:0} overlay_0-596:{mountpoint:/var/lib/containers/storage/overlay/b2e59a1b07fa5b9ab5e66fb859f27c29bff592dcab729d959d49401d3a3d61c7/merged major:0 minor:596 fsType:overlay blockSize:0} overlay_0-599:{mountpoint:/var/lib/containers/storage/overlay/7f786dd49979bf64924a3e8df99df221e176956187b24629fcadfe7ffd060510/merged major:0 minor:599 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/01bc2fe9632e5a08a52a2de09638c6788928d404549a020ac4cd4a493eb03249/merged major:0 minor:60 fsType:overlay blockSize:0} 
overlay_0-603:{mountpoint:/var/lib/containers/storage/overlay/76161f039c79d78c692c416fd9515a0c0b9005a9bcb667f17ae7c6131ec66935/merged major:0 minor:603 fsType:overlay blockSize:0} overlay_0-614:{mountpoint:/var/lib/containers/storage/overlay/e63a3c37abc98238dc922d44d1790d1fa188a1c57f95ad2de48a9bc8030b933a/merged major:0 minor:614 fsType:overlay blockSize:0} overlay_0-617:{mountpoint:/var/lib/containers/storage/overlay/268f612776e7ef5580c9f26fb0e778f9f12ac43a2abb1513c884c23630c90b4e/merged major:0 minor:617 fsType:overlay blockSize:0} overlay_0-618:{mountpoint:/var/lib/containers/storage/overlay/d92d4aa26235d4320b622c772eae422ec2c27470c7e6ab071459a10382492fda/merged major:0 minor:618 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/39a4c47fd7d1751606f47d8a0c9d7017eac6c86c82b4792f64ed6234ac9c606b/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-627:{mountpoint:/var/lib/containers/storage/overlay/5494522d33d55ec0ae36a5e7b8b3af79a5af9d558819cbf81baa2c4ec94482c2/merged major:0 minor:627 fsType:overlay blockSize:0} overlay_0-628:{mountpoint:/var/lib/containers/storage/overlay/8182a6c352631eb89d7c7a95772a5c945193b3a41c25a9278f1b632ef9e4d04f/merged major:0 minor:628 fsType:overlay blockSize:0} overlay_0-630:{mountpoint:/var/lib/containers/storage/overlay/42c3d04baf36e4141d8bb327a5bdaa1f05ae76ebe35150556dc952ed067e956c/merged major:0 minor:630 fsType:overlay blockSize:0} overlay_0-633:{mountpoint:/var/lib/containers/storage/overlay/55c350069582d92af6bcd8870b3e79bfa7b7859f9af8b1036e87deefdf22c057/merged major:0 minor:633 fsType:overlay blockSize:0} overlay_0-635:{mountpoint:/var/lib/containers/storage/overlay/2d413f48d772f713684dda782dab4f7f22cace9c41b3ece48c69c911f76193ac/merged major:0 minor:635 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/d8025332e8e69ec6eae766057365c019059d2c5caa889298536879c7caabf21e/merged major:0 minor:64 fsType:overlay blockSize:0} 
overlay_0-641:{mountpoint:/var/lib/containers/storage/overlay/39daa044c34a64da3cf153e0ba75167cc9a8a366052f71ebf1db5c3c92d936ff/merged major:0 minor:641 fsType:overlay blockSize:0} overlay_0-643:{mountpoint:/var/lib/containers/storage/overlay/548bcc70f4a0c3613ffeb48909c4c363309d3af43d4566d91b5d63374434a60a/merged major:0 minor:643 fsType:overlay blockSize:0} overlay_0-647:{mountpoint:/var/lib/containers/storage/overlay/770ca639fa8f02f009f5fd086cd3df967304dc4d0671f4fb5c5f8f81a81217e9/merged major:0 minor:647 fsType:overlay blockSize:0} overlay_0-649:{mountpoint:/var/lib/containers/storage/overlay/efdac1e546c9d5710b7877015d0dddd44f975fbce89803f9ae0568134a9ae53f/merged major:0 minor:649 fsType:overlay blockSize:0} overlay_0-651:{mountpoint:/var/lib/containers/storage/overlay/8c2e1462c2355518dfbd15bad30f2f79b364ae720a5ac62bcc409ded733d5192/merged major:0 minor:651 fsType:overlay blockSize:0} overlay_0-653:{mountpoint:/var/lib/containers/storage/overlay/8d1b9c35bac6579123f42e5b6bbfe934e6fc1b0c2ad6c51abb9d043233f4b877/merged major:0 minor:653 fsType:overlay blockSize:0} overlay_0-656:{mountpoint:/var/lib/containers/storage/overlay/7f43c2a2d4572b28e3cd661a956ba0f4aa7b3487c63e98d1288f81f7a6d7e0df/merged major:0 minor:656 fsType:overlay blockSize:0} overlay_0-658:{mountpoint:/var/lib/containers/storage/overlay/8d5e5cc9acd98c8c9850ee02b0aefd22ee3a5ebe83e1d73127e3cedec1309bde/merged major:0 minor:658 fsType:overlay blockSize:0} overlay_0-661:{mountpoint:/var/lib/containers/storage/overlay/8695908ceb2b4414f64ace08799b5291146091bbd1bcc674d00a95f126d4a29c/merged major:0 minor:661 fsType:overlay blockSize:0} overlay_0-664:{mountpoint:/var/lib/containers/storage/overlay/880dadf8ed7011a1de046e5b91a2cb12f8d5cf76a7fbd493bcc767a77bcc025f/merged major:0 minor:664 fsType:overlay blockSize:0} overlay_0-666:{mountpoint:/var/lib/containers/storage/overlay/cd5bf7b00a796a9bd1361c09e944230379488aea61cdd52d5a1eeedef0aa95ee/merged major:0 minor:666 fsType:overlay blockSize:0} 
overlay_0-667:{mountpoint:/var/lib/containers/storage/overlay/5b7877a995aa1ae8fcbb634cba89e17449a1d59735605bc4aa670058000a089b/merged major:0 minor:667 fsType:overlay blockSize:0} overlay_0-685:{mountpoint:/var/lib/containers/storage/overlay/718f6d29ae1cabba70ba61c39073e43a6ccb1a94dad1845ffc9c3d490cf6e6db/merged major:0 minor:685 fsType:overlay blockSize:0} overlay_0-688:{mountpoint:/var/lib/containers/storage/overlay/5c78536be674f7eae2aeb32a9f037c33d30291e9b24d34e9b3b23f78ece30754/merged major:0 minor:688 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/3757136ffedaa89354ba98a725348a61a9e74b5d15e2a2257bc243912ac71fd5/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-692:{mountpoint:/var/lib/containers/storage/overlay/9fb0bc7f2d17384ef6ffd82f53ad9ea68ad661cbeeb4c0faaf09c3644caaf13e/merged major:0 minor:692 fsType:overlay blockSize:0} overlay_0-70:{mountpoint:/var/lib/containers/storage/overlay/0bf8904185efc5d3d28a3000cf9d8d47597429ec838f3ef90a7921c75f86cb97/merged major:0 minor:70 fsType:overlay blockSize:0} overlay_0-703:{mountpoint:/var/lib/containers/storage/overlay/50d00035605f697ba8ddbef4950e7ef2c0cb7551928321e7174e699ce79986a9/merged major:0 minor:703 fsType:overlay blockSize:0} overlay_0-722:{mountpoint:/var/lib/containers/storage/overlay/7e3e9b7f578b37434753f647b08614304fb0b4a3c07f7e34eadb46b8c92c7530/merged major:0 minor:722 fsType:overlay blockSize:0} overlay_0-728:{mountpoint:/var/lib/containers/storage/overlay/074561b50ec82e8b792e4f60805569d005537a12dff2998e0c856d95df4675aa/merged major:0 minor:728 fsType:overlay blockSize:0} overlay_0-735:{mountpoint:/var/lib/containers/storage/overlay/ae9f3ea0a664515c4a58d41f5c94247ff3289dfef85eae4f1fff00987c4bb48c/merged major:0 minor:735 fsType:overlay blockSize:0} overlay_0-749:{mountpoint:/var/lib/containers/storage/overlay/aac8f69142350859f46b62220b2dfca405e52af1b1025fb56724c9460bd950af/merged major:0 minor:749 fsType:overlay blockSize:0} 
overlay_0-75:{mountpoint:/var/lib/containers/storage/overlay/cd0de6fb6db44bee5d894247ab28e0f3f9241912641774dffc7592845271ac31/merged major:0 minor:75 fsType:overlay blockSize:0} overlay_0-750:{mountpoint:/var/lib/containers/storage/overlay/bbeb27e6985803c112377a8010f3f8cb53ecbf35cbf7129ddbf2e57e33d67384/merged major:0 minor:750 fsType:overlay blockSize:0} overlay_0-760:{mountpoint:/var/lib/containers/storage/overlay/82862181b062275b497047fcfa9fa53f0fc12299829a8d93293d02771c83cbee/merged major:0 minor:760 fsType:overlay blockSize:0} overlay_0-767:{mountpoint:/var/lib/containers/storage/overlay/62e4d0936d506a81e9b331ae9f86f5c45dda626abafb0f0590a31ae7a5a2c15c/merged major:0 minor:767 fsType:overlay blockSize:0} overlay_0-769:{mountpoint:/var/lib/containers/storage/overlay/6f33e811c0553d944a83a6241bb618390cb22e79e030f3d6dc25af421ce08421/merged major:0 minor:769 fsType:overlay blockSize:0} overlay_0-780:{mountpoint:/var/lib/containers/storage/overlay/55ece25ebaa63ef5a0b46bb348f4b6d676860ce63952fac9fdfc52ef770aab13/merged major:0 minor:780 fsType:overlay blockSize:0} overlay_0-781:{mountpoint:/var/lib/containers/storage/overlay/112c160750eb265c6de62e230ce323c983c98d136dcd7bdee91a43bbfb000096/merged major:0 minor:781 fsType:overlay blockSize:0} overlay_0-785:{mountpoint:/var/lib/containers/storage/overlay/8c5c277b41715fa0d597257897f7c5855fd4cc4582dd732dd2a468f047d40419/merged major:0 minor:785 fsType:overlay blockSize:0} overlay_0-797:{mountpoint:/var/lib/containers/storage/overlay/ecabe61904b9962254c4e38b9fa3884f8d93819db22e370e1a1a42d3f3e4211a/merged major:0 minor:797 fsType:overlay blockSize:0} overlay_0-799:{mountpoint:/var/lib/containers/storage/overlay/d27218624281a94d2dd998b2bb0138812961d1bafa8a197366837bba2d0434fd/merged major:0 minor:799 fsType:overlay blockSize:0} overlay_0-810:{mountpoint:/var/lib/containers/storage/overlay/df36e408bee30eca531ed901ada7936acd6d94aef2b4ac28573cbe0546143689/merged major:0 minor:810 fsType:overlay blockSize:0} 
overlay_0-812:{mountpoint:/var/lib/containers/storage/overlay/85843b37f00d614fbf785910c44983915b4c6194e4222018784344729e9ad63a/merged major:0 minor:812 fsType:overlay blockSize:0} overlay_0-82:{mountpoint:/var/lib/containers/storage/overlay/d69e83f99dd1febf490a1fd20aa164fcfc791ced074e35627511d280ac1cb054/merged major:0 minor:82 fsType:overlay blockSize:0} overlay_0-833:{mountpoint:/var/lib/containers/storage/overlay/b6a2914a73adf506d5cd8e423072e0938cb8e1ba74825e449ced0f29448948aa/merged major:0 minor:833 fsType:overlay blockSize:0} overlay_0-851:{mountpoint:/var/lib/containers/storage/overlay/4c0bc6c7747dcf2878b4ed656a13450de23c90b3b8bbe1384f38e209389987e9/merged major:0 minor:851 fsType:overlay blockSize:0} overlay_0-854:{mountpoint:/var/lib/containers/storage/overlay/fd4f544ae7a023b41a0b758e0e9aef039b03639e909ca8d4b18997c925dd6992/merged major:0 minor:854 fsType:overlay blockSize:0} overlay_0-857:{mountpoint:/var/lib/containers/storage/overlay/5e23378ad1a7018fd5788398d32c522dc1bb7af777ce05a562ed8aa2af0a7a57/merged major:0 minor:857 fsType:overlay blockSize:0} overlay_0-86:{mountpoint:/var/lib/containers/storage/overlay/4c6107cfa44198c864b77635c3e3549fef2dbffbc03989a832148355b656963a/merged major:0 minor:86 fsType:overlay blockSize:0} overlay_0-862:{mountpoint:/var/lib/containers/storage/overlay/b2fbd7d834d3070365d79cc7e0d98969efb9f40fcb6a99a626d2617eb843df5e/merged major:0 minor:862 fsType:overlay blockSize:0} overlay_0-864:{mountpoint:/var/lib/containers/storage/overlay/0c4b991d0186a8ac1b70754b66f206d3b40b67090b2411515351504aa8daf772/merged major:0 minor:864 fsType:overlay blockSize:0} overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/f142259bb1995c6af61c094b83463c6c0cdccc711dbffa691d40b624196edac1/merged major:0 minor:885 fsType:overlay blockSize:0} overlay_0-889:{mountpoint:/var/lib/containers/storage/overlay/7a7b2ed84d5f3a8ef8aa259c84717cde2664beb16d1f406c84ee7e2e2e097cf2/merged major:0 minor:889 fsType:overlay blockSize:0} 
overlay_0-892:{mountpoint:/var/lib/containers/storage/overlay/e9bd292c1c0462cbb43f2f4ea85a4d0d71a49324e75b557a3c1a6157f9d8a9ea/merged major:0 minor:892 fsType:overlay blockSize:0} overlay_0-894:{mountpoint:/var/lib/containers/storage/overlay/bdda390bb8b94c3e639caf591ac0a6c10050a67b5a09209ed6755e8ae7226a2a/merged major:0 minor:894 fsType:overlay blockSize:0} overlay_0-896:{mountpoint:/var/lib/containers/storage/overlay/2a7b131f6bac06df5e09f9eb28e226ac8129cfe9b93078382fbc549319aaea65/merged major:0 minor:896 fsType:overlay blockSize:0} overlay_0-905:{mountpoint:/var/lib/containers/storage/overlay/10997aa221f7e59cee70e83e83b10afc1c273ef3d5642ac1976859c94e9a4834/merged major:0 minor:905 fsType:overlay blockSize:0} overlay_0-909:{mountpoint:/var/lib/containers/storage/overlay/4533946df5d07bf636794fd6cac179433f6999382e379dec0f74dcc3c6086140/merged major:0 minor:909 fsType:overlay blockSize:0} overlay_0-912:{mountpoint:/var/lib/containers/storage/overlay/3f0782a877a7b25e45f26733d35825ce838bcc60c5caf2f3feb616600eb0254a/merged major:0 minor:912 fsType:overlay blockSize:0} overlay_0-914:{mountpoint:/var/lib/containers/storage/overlay/d46d9339f78bfbd601a8a84f2cb89cf2e1d2b6ff6ef6cde160d9c2fe56335016/merged major:0 minor:914 fsType:overlay blockSize:0} overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/b5cd08f09e213444946b4c7689d877a5774adcdb085ca8ac30a287337d8fa8be/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-924:{mountpoint:/var/lib/containers/storage/overlay/937fe600c3f0e5e1e75c498e0101acb4cd16ed4327ccf0b8dcce6a87abceb932/merged major:0 minor:924 fsType:overlay blockSize:0} overlay_0-955:{mountpoint:/var/lib/containers/storage/overlay/f93fe421e09ff6c5b4368bb2521727b08f5f35de95ac55b5e8e06a7de207cd66/merged major:0 minor:955 fsType:overlay blockSize:0} overlay_0-957:{mountpoint:/var/lib/containers/storage/overlay/119f8a6ab6c76d0158bfb88d50b111e62e1646859fffef6818f26483ad36f108/merged major:0 minor:957 fsType:overlay blockSize:0} 
overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/7e05c8d4ed21670f49bc5873cd201e10fc61a471f339caca199d8231504d36ae/merged major:0 minor:96 fsType:overlay blockSize:0} overlay_0-963:{mountpoint:/var/lib/containers/storage/overlay/d959d2da7db9a3639d2be1f16cf809d124dc70b235fcf55fba67b737ca53ba10/merged major:0 minor:963 fsType:overlay blockSize:0} overlay_0-979:{mountpoint:/var/lib/containers/storage/overlay/ab0ecb9c4693466e75b3b5077395cb24e12aa9030fe70b7da9e3daa7fe02f956/merged major:0 minor:979 fsType:overlay blockSize:0} overlay_0-984:{mountpoint:/var/lib/containers/storage/overlay/c199fc1a95e5241b7793a23880e288659aa06fcb95ae142c254c58f479e5d131/merged major:0 minor:984 fsType:overlay blockSize:0} overlay_0-987:{mountpoint:/var/lib/containers/storage/overlay/c08d842e4a77b695c223104138c6cb23f7719add342bc0d2a6887d6f072f079c/merged major:0 minor:987 fsType:overlay blockSize:0} overlay_0-989:{mountpoint:/var/lib/containers/storage/overlay/fc3f093ac74e5ef860155b88fa3e6d36ac238be53da5b5a9fded40495c21438f/merged major:0 minor:989 fsType:overlay blockSize:0} overlay_0-995:{mountpoint:/var/lib/containers/storage/overlay/e93a0d3ab1ffc0f69589c4198f56b8182f2027a30907a68fa5ae6af43e996f59/merged major:0 minor:995 fsType:overlay blockSize:0} overlay_0-997:{mountpoint:/var/lib/containers/storage/overlay/8ebcafdc3020ea3e624b7907e4389dfa0a92ed548998ab5c1a7a6931b13afafe/merged major:0 minor:997 fsType:overlay blockSize:0} overlay_0-999:{mountpoint:/var/lib/containers/storage/overlay/6f57e158998c352be9382c2fb520db1f8e29734d6ba293021a82d944d30fe5e8/merged major:0 minor:999 fsType:overlay blockSize:0}] Mar 13 10:41:32.431721 master-0 kubenswrapper[17876]: I0313 10:41:32.331032 17876 manager.go:217] Machine: {Timestamp:2026-03-13 10:41:32.325682121 +0000 UTC m=+0.161488647 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 
AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:3a49dbefec214e87acca6e8120215b7b SystemUUID:3a49dbef-ec21-4e87-acca-6e8120215b7b BootID:794a19f0-76ba-45e8-ae39-0211fb872ab6 Filesystems:[{Device:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:741 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/257ae542-4a06-42d3-b3e8-bf0a376494a8/volumes/kubernetes.io~projected/kube-api-access-fswp7 DeviceMajor:0 DeviceMinor:826 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1095 DeviceMajor:0 DeviceMinor:1095 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-667 DeviceMajor:0 DeviceMinor:667 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-346 DeviceMajor:0 DeviceMinor:346 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a0917212-59d8-4799-a9bc-52e358c5e8a0/volumes/kubernetes.io~projected/kube-api-access-lrmcp DeviceMajor:0 DeviceMinor:829 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8b07c5ae-1149-4031-bd92-6df4331e586c/volumes/kubernetes.io~projected/kube-api-access-4kn26 DeviceMajor:0 DeviceMinor:299 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-503 DeviceMajor:0 DeviceMinor:503 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~projected/kube-api-access-zscfc DeviceMajor:0 DeviceMinor:127 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/64cac6ba3a561adbc8f8770dc2f28e49933388f06613c25151f7bbd0ceb39107/userdata/shm DeviceMajor:0 DeviceMinor:455 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-118 
DeviceMajor:0 DeviceMinor:118 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-86 DeviceMajor:0 DeviceMinor:86 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b7090328-1191-4c7c-afed-603d7333014f/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:842 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1126 DeviceMajor:0 DeviceMinor:1126 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-107 DeviceMajor:0 DeviceMinor:107 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:216 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251/userdata/shm DeviceMajor:0 DeviceMinor:265 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c/userdata/shm DeviceMajor:0 DeviceMinor:426 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-511 DeviceMajor:0 DeviceMinor:511 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-614 DeviceMajor:0 DeviceMinor:614 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-999 DeviceMajor:0 DeviceMinor:999 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-166 DeviceMajor:0 DeviceMinor:166 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/af7a768842b9cbb587f10537824efb3089e2d3b4f70fb674c1d644bca3af49d7/userdata/shm DeviceMajor:0 DeviceMinor:1041 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/258f571e-5ec8-42df-b4ba-17457d87d10d/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1034 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:381 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/19f35bad4079f0b545148fd4db4666ab80db062f38092a6802b80cab4ec7982a/userdata/shm DeviceMajor:0 DeviceMinor:461 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-979 DeviceMajor:0 DeviceMinor:979 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~projected/kube-api-access-hwfd8 DeviceMajor:0 DeviceMinor:226 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-283 DeviceMajor:0 DeviceMinor:283 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-476 DeviceMajor:0 DeviceMinor:476 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b02805e2-f186-4e59-bdfa-f4793263b468/volumes/kubernetes.io~projected/kube-api-access-cvl4j DeviceMajor:0 DeviceMinor:804 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-810 DeviceMajor:0 DeviceMinor:810 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-661 DeviceMajor:0 DeviceMinor:661 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/277614e8-838f-4773-bcfc-89f19c620dee/volumes/kubernetes.io~projected/kube-api-access-jzvxz DeviceMajor:0 DeviceMinor:1038 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-144 DeviceMajor:0 DeviceMinor:144 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1d85f90b35c0a6fe94e4911c5e6e2a9798938c9acd1504a9008825c00646ea44/userdata/shm DeviceMajor:0 DeviceMinor:1070 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-643 DeviceMajor:0 DeviceMinor:643 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1004 DeviceMajor:0 DeviceMinor:1004 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff/userdata/shm DeviceMajor:0 DeviceMinor:974 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-361 DeviceMajor:0 DeviceMinor:361 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3e15f776-d153-4289-91c7-893584104185/volumes/kubernetes.io~projected/kube-api-access-2gcf6 DeviceMajor:0 DeviceMinor:438 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1060 DeviceMajor:0 DeviceMinor:1060 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-534 DeviceMajor:0 DeviceMinor:534 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-760 DeviceMajor:0 DeviceMinor:760 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587/volumes/kubernetes.io~projected/kube-api-access-qr5lp DeviceMajor:0 DeviceMinor:317 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1043 DeviceMajor:0 
DeviceMinor:1043 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~projected/kube-api-access-q6smf DeviceMajor:0 DeviceMinor:1081 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7c5279e3-0165-4347-bfc7-87b80accaab3/volumes/kubernetes.io~secret/kube-state-metrics-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:89 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327/userdata/shm DeviceMajor:0 DeviceMinor:275 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-279 DeviceMajor:0 DeviceMinor:279 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ec33c506-8abe-4659-84d3-a294c31b446c/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:428 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:429 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2563ecb2-5783-4c45-a7f6-180e14e1c8c4/volumes/kubernetes.io~projected/kube-api-access-4fcqg DeviceMajor:0 DeviceMinor:824 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-82 DeviceMajor:0 DeviceMinor:82 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:213 Capacity:32475529216 Type:vfs Inodes:4108170 
HasInodes:true} {Device:overlay_0-418 DeviceMajor:0 DeviceMinor:418 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:743 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-799 DeviceMajor:0 DeviceMinor:799 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b7090328-1191-4c7c-afed-603d7333014f/volumes/kubernetes.io~projected/kube-api-access-v9cxp DeviceMajor:0 DeviceMinor:976 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:726 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-41 DeviceMajor:0 DeviceMinor:41 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-472 DeviceMajor:0 DeviceMinor:472 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-688 DeviceMajor:0 DeviceMinor:688 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5f4e5674ade432e52f9563a1f07684d2d9624c5df1e6b8e0fa3c971d3c078df8/userdata/shm DeviceMajor:0 DeviceMinor:764 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-628 DeviceMajor:0 DeviceMinor:628 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-487 DeviceMajor:0 DeviceMinor:487 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2e4a3a4a7895f019e0118f1584bc95eca1f9c60af18c9d3fe595f768be766c6d/userdata/shm 
DeviceMajor:0 DeviceMinor:457 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/26320b73ca3fce1850dde3e75da5ccc58878b72f0f352ff1a9c176723a2b7d3d/userdata/shm DeviceMajor:0 DeviceMinor:459 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4a9a41f76fe188e7c2fc303922714d8a4a4540bbc426c47477e0dbcbe14a461c/userdata/shm DeviceMajor:0 DeviceMinor:619 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:598 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8988806dc69dce5b61c53cc2845447a33f520244d709f93fdb6f76499aee8916/userdata/shm DeviceMajor:0 DeviceMinor:808 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a0917212-59d8-4799-a9bc-52e358c5e8a0/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:828 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-633 DeviceMajor:0 DeviceMinor:633 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-409 DeviceMajor:0 DeviceMinor:409 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5/userdata/shm DeviceMajor:0 DeviceMinor:271 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-285 DeviceMajor:0 DeviceMinor:285 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-543 DeviceMajor:0 DeviceMinor:543 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4f70e184622d577e74124d1d17bc445ea80514437cbc221bcb9f2c6f012aa2ca/userdata/shm DeviceMajor:0 DeviceMinor:700 
Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-892 DeviceMajor:0 DeviceMinor:892 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-997 DeviceMajor:0 DeviceMinor:997 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1021 DeviceMajor:0 DeviceMinor:1021 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19/userdata/shm DeviceMajor:0 DeviceMinor:109 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-424 DeviceMajor:0 DeviceMinor:424 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2fe7b69e87a4fa6425da976dffbe87c8c66862e1127867967d8f83ef262d49b7/userdata/shm DeviceMajor:0 DeviceMinor:456 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7623887564e1fd29b1c01e5d18c6715a43b71a693407bef1bea029e2735f11dd/userdata/shm DeviceMajor:0 DeviceMinor:620 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83/userdata/shm DeviceMajor:0 DeviceMinor:46 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:388 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/84f78350-e85c-4377-97cd-9e9a1b2ff4ee/volumes/kubernetes.io~projected/kube-api-access-d5v4b DeviceMajor:0 DeviceMinor:420 Capacity:32475529216 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:442 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:444 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-864 DeviceMajor:0 DeviceMinor:864 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:214 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9/userdata/shm DeviceMajor:0 DeviceMinor:308 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3e15f776-d153-4289-91c7-893584104185/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:413 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a3ae0ef1861ea401e0b8a9b1d8fd796b2315f2b16e1b237d258aa72508e4e53/userdata/shm DeviceMajor:0 DeviceMinor:1047 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~secret/prometheus-operator-tls DeviceMajor:0 DeviceMinor:967 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/58685de6-b4ae-4229-870b-5143a6010450/volumes/kubernetes.io~projected/kube-api-access-kn5nv DeviceMajor:0 DeviceMinor:260 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/a7b698d2-f23a-4404-bc63-757ca549356f/volumes/kubernetes.io~projected/kube-api-access-zltcf DeviceMajor:0 DeviceMinor:303 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-350 DeviceMajor:0 DeviceMinor:350 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-515 DeviceMajor:0 DeviceMinor:515 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0496ccdf85c50cc91c17d6bf9ff564f60d26a99a551976f29e99ca9cd056f4fc/userdata/shm DeviceMajor:0 DeviceMinor:594 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906/volumes/kubernetes.io~projected/kube-api-access-g8dpd DeviceMajor:0 DeviceMinor:307 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a9692d62aeb99fb7d4d3fc80637ffdf1ea3947790e26d640f42aacc16302c11/userdata/shm DeviceMajor:0 DeviceMinor:318 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:610 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~projected/kube-api-access-892f7 DeviceMajor:0 DeviceMinor:234 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-411 DeviceMajor:0 DeviceMinor:411 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:742 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-685 DeviceMajor:0 DeviceMinor:685 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-963 DeviceMajor:0 DeviceMinor:963 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-400 DeviceMajor:0 DeviceMinor:400 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~projected/kube-api-access-kd99t DeviceMajor:0 DeviceMinor:228 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1019 DeviceMajor:0 DeviceMinor:1019 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1053 DeviceMajor:0 DeviceMinor:1053 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fe76c4da023ee8241529e5f2a6a092dc48a1a51d30db462a00bc458437ba96ee/userdata/shm DeviceMajor:0 DeviceMinor:980 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:441 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-536 DeviceMajor:0 DeviceMinor:536 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f99b999c-4213-4d29-ab14-26c584e88445/volumes/kubernetes.io~projected/kube-api-access-bn7vq DeviceMajor:0 DeviceMinor:300 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-889 DeviceMajor:0 DeviceMinor:889 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0529b217-a9ef-48fb-b40a-b6789c640c20/volumes/kubernetes.io~projected/kube-api-access-m5x2b DeviceMajor:0 
DeviceMinor:155 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-987 DeviceMajor:0 DeviceMinor:987 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b/userdata/shm DeviceMajor:0 DeviceMinor:821 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f/userdata/shm DeviceMajor:0 DeviceMinor:837 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-658 DeviceMajor:0 DeviceMinor:658 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~projected/kube-api-access-s2kqq DeviceMajor:0 DeviceMinor:233 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-366 DeviceMajor:0 DeviceMinor:366 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e87ca16c-25de-4fea-b900-2960f4a5f95e/volumes/kubernetes.io~projected/kube-api-access-wrq5t DeviceMajor:0 DeviceMinor:247 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/53da2840-4a92-497a-a9d3-973583887147/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-540 DeviceMajor:0 DeviceMinor:540 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-592 DeviceMajor:0 DeviceMinor:592 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-703 DeviceMajor:0 DeviceMinor:703 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b02805e2-f186-4e59-bdfa-f4793263b468/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:789 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b57f1c19-f44a-4405-8135-79aef1d1ce07/volumes/kubernetes.io~projected/kube-api-access-mnnnp DeviceMajor:0 DeviceMinor:336 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b6406db9242e3599a9f6b43c6cc7f931a2398c12649757d5a331d9757d32028e/userdata/shm DeviceMajor:0 DeviceMinor:838 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa/userdata/shm DeviceMajor:0 DeviceMinor:1039 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/8dc7af5f-ff72-4f06-88df-a26ff4c0bded/volumes/kubernetes.io~projected/kube-api-access-vmf6l DeviceMajor:0 DeviceMinor:978 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-912 DeviceMajor:0 DeviceMinor:912 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-722 DeviceMajor:0 DeviceMinor:722 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-519 DeviceMajor:0 DeviceMinor:519 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~projected/kube-api-access-twcrj DeviceMajor:0 DeviceMinor:241 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-369 DeviceMajor:0 DeviceMinor:369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2659c5a6a41b8bd57f0bf3c1da691ca647e461b974a89f7c9f8fe2c464e9654a/userdata/shm DeviceMajor:0 DeviceMinor:590 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:819 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-178 DeviceMajor:0 DeviceMinor:178 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fb060653-0d4b-4759-a7a1-c5dce194cce7/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bcfacb71ae88d504692e95ad77d6c9b51c2d2697daec2bf687474302cc5abf90/userdata/shm DeviceMajor:0 DeviceMinor:397 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-532 DeviceMajor:0 DeviceMinor:532 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/d9fd7b06-d61d-47c3-a08f-846245c79cc9/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:390 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190/userdata/shm DeviceMajor:0 DeviceMinor:462 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-914 DeviceMajor:0 DeviceMinor:914 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-293 DeviceMajor:0 DeviceMinor:293 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-291 DeviceMajor:0 DeviceMinor:291 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:440 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-812 DeviceMajor:0 DeviceMinor:812 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1116 DeviceMajor:0 DeviceMinor:1116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-281 DeviceMajor:0 DeviceMinor:281 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~projected/kube-api-access-f6fm9 DeviceMajor:0 DeviceMinor:103 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~projected/kube-api-access-nqrh5 DeviceMajor:0 DeviceMinor:449 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/390d92c6b1bf8de4d4ea48cb675d878d3b2cbd2b0311fc47e5e4feef80f55449/userdata/shm DeviceMajor:0 DeviceMinor:458 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2157cb66-d458-4353-bc9c-ef761e61e5c5/volumes/kubernetes.io~projected/kube-api-access-gntlk DeviceMajor:0 DeviceMinor:338 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5759216ebfee850b79609783445de8124c370c8bac5b63e2b5f03e38c742e1f0/userdata/shm DeviceMajor:0 DeviceMinor:718 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e95e82ba3152944d5f266f4315ecef6f288f0249fcf6dd92d242f6cd35eb008a/userdata/shm DeviceMajor:0 DeviceMinor:719 Capacity:67108864 Type:vfs 
Inodes:4108170 HasInodes:true} {Device:overlay_0-510 DeviceMajor:0 DeviceMinor:510 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d/userdata/shm DeviceMajor:0 DeviceMinor:1017 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-797 DeviceMajor:0 DeviceMinor:797 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1108 DeviceMajor:0 DeviceMinor:1108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-171 DeviceMajor:0 DeviceMinor:171 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c918fb3b270e41c6d62b6e571b5882afaab66a46ce66ce229de4e70f9853f259/userdata/shm DeviceMajor:0 DeviceMinor:392 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-482 DeviceMajor:0 DeviceMinor:482 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-538 DeviceMajor:0 DeviceMinor:538 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:555 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:556 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-603 DeviceMajor:0 DeviceMinor:603 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:831 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~projected/kube-api-access-xkwfv DeviceMajor:0 DeviceMinor:242 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-651 DeviceMajor:0 DeviceMinor:651 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5/userdata/shm DeviceMajor:0 DeviceMinor:993 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513/userdata/shm DeviceMajor:0 DeviceMinor:1045 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-564 DeviceMajor:0 DeviceMinor:564 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f/userdata/shm DeviceMajor:0 DeviceMinor:333 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a/userdata/shm DeviceMajor:0 DeviceMinor:105 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2c3e94d4-5c6d-4092-975c-e5bca49eb397/volumes/kubernetes.io~projected/kube-api-access-htb49 DeviceMajor:0 DeviceMinor:332 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/61427254-6722-4d1a-a96a-dadd24abbe94/volumes/kubernetes.io~projected/kube-api-access-6vbsc DeviceMajor:0 DeviceMinor:827 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1035 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-630 DeviceMajor:0 DeviceMinor:630 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-989 DeviceMajor:0 DeviceMinor:989 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1033 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-666 DeviceMajor:0 DeviceMinor:666 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-547 DeviceMajor:0 DeviceMinor:547 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3a72b45-a705-4335-9c04-c952ec5d9975/volumes/kubernetes.io~projected/kube-api-access-b5gkd DeviceMajor:0 DeviceMinor:560 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/06ecac2e-bffa-474b-a824-9ba4a194159a/volumes/kubernetes.io~projected/kube-api-access-6p29b DeviceMajor:0 DeviceMinor:766 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b090a7b841b2284b4a367b1fe9eb531751b92400aca909b51b87e9d7691a206c/userdata/shm DeviceMajor:0 DeviceMinor:776 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:830 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-749 DeviceMajor:0 DeviceMinor:749 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-525 DeviceMajor:0 DeviceMinor:525 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1055 DeviceMajor:0 DeviceMinor:1055 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-140 DeviceMajor:0 DeviceMinor:140 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:232 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/024d9bd3-ac77-4257-9808-7518f2a73e11/volumes/kubernetes.io~projected/kube-api-access-ffs2h DeviceMajor:0 DeviceMinor:237 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e7d31378-e940-4473-ab37-10f250c76666/volumes/kubernetes.io~projected/kube-api-access-b9768 DeviceMajor:0 DeviceMinor:239 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:445 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/97328e01-1227-417e-9af7-6426495d96db/volumes/kubernetes.io~projected/kube-api-access-ffmmr DeviceMajor:0 DeviceMinor:832 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~secret/prometheus-operator-kube-rbac-proxy-config DeviceMajor:0 DeviceMinor:971 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-750 DeviceMajor:0 DeviceMinor:750 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:761 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:608 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:223 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/ba3e43ba-2840-4612-a370-87ad3c5a382a/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:225 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018/userdata/shm DeviceMajor:0 DeviceMinor:251 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2/userdata/shm DeviceMajor:0 DeviceMinor:844 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/ec33c506-8abe-4659-84d3-a294c31b446c/volumes/kubernetes.io~projected/kube-api-access-jk4qr DeviceMajor:0 DeviceMinor:450 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-474 DeviceMajor:0 DeviceMinor:474 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c/volumes/kubernetes.io~projected/kube-api-access-ppjzw DeviceMajor:0 DeviceMinor:437 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-635 DeviceMajor:0 DeviceMinor:635 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919/userdata/shm DeviceMajor:0 DeviceMinor:491 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-894 DeviceMajor:0 DeviceMinor:894 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-735 DeviceMajor:0 DeviceMinor:735 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:389 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e/volumes/kubernetes.io~projected/kube-api-access-5rqms DeviceMajor:0 DeviceMinor:1016 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-54 DeviceMajor:0 DeviceMinor:54 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~projected/kube-api-access-d72bw DeviceMajor:0 DeviceMinor:254 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00/userdata/shm DeviceMajor:0 
DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-295 DeviceMajor:0 DeviceMinor:295 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-599 DeviceMajor:0 DeviceMinor:599 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2563ecb2-5783-4c45-a7f6-180e14e1c8c4/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:337 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9fb60bfa59d2ff40288f456815269ff4c838e82195edd334933c8654b4f8dedd/userdata/shm DeviceMajor:0 DeviceMinor:721 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-896 DeviceMajor:0 DeviceMinor:896 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~projected/kube-api-access-9c92k DeviceMajor:0 DeviceMinor:250 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-924 DeviceMajor:0 DeviceMinor:924 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b460735c-56aa-4dd3-a756-759859083e12/volumes/kubernetes.io~projected/kube-api-access-qr9x5 DeviceMajor:0 DeviceMinor:1036 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1049 DeviceMajor:0 DeviceMinor:1049 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a/userdata/shm DeviceMajor:0 DeviceMinor:972 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65/userdata/shm DeviceMajor:0 DeviceMinor:90 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1121 DeviceMajor:0 DeviceMinor:1121 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6c3bc64f22f8c58f9e978db84c7754f9ee2b132931d3190f29d081554cf105af/userdata/shm DeviceMajor:0 DeviceMinor:74 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes/kubernetes.io~projected/kube-api-access-4xqz6 DeviceMajor:0 DeviceMinor:763 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-656 DeviceMajor:0 DeviceMinor:656 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/77b4f8a8bc891942c93fc6bc58a70209e4d2685ce12294e206b71662186490b9/userdata/shm DeviceMajor:0 DeviceMinor:452 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-955 DeviceMajor:0 DeviceMinor:955 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-301 DeviceMajor:0 DeviceMinor:301 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-568 DeviceMajor:0 DeviceMinor:568 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes/kubernetes.io~projected/kube-api-access-xl7xt DeviceMajor:0 DeviceMinor:684 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-833 DeviceMajor:0 
DeviceMinor:833 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-728 DeviceMajor:0 DeviceMinor:728 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1093 DeviceMajor:0 DeviceMinor:1093 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1023 DeviceMajor:0 DeviceMinor:1023 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-297 DeviceMajor:0 DeviceMinor:297 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~projected/kube-api-access-4nr6p DeviceMajor:0 DeviceMinor:611 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-649 DeviceMajor:0 DeviceMinor:649 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-781 DeviceMajor:0 DeviceMinor:781 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-323 DeviceMajor:0 DeviceMinor:323 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-116 DeviceMajor:0 DeviceMinor:116 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:85 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2/userdata/shm DeviceMajor:0 DeviceMinor:256 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/a13f3e08-2b67-404f-8695-77aa17f92137/volumes/kubernetes.io~projected/kube-api-access-bzxzq DeviceMajor:0 DeviceMinor:231 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/b5ed7aff-47c0-42f3-9a26-9385d2bde582/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:398 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e3e74e8a6d87769b2b8f6bdae5a948fbb44f464be31e39d10a8d9e290f6b63c1/userdata/shm DeviceMajor:0 DeviceMinor:840 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-617 DeviceMajor:0 DeviceMinor:617 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-905 DeviceMajor:0 DeviceMinor:905 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1097 DeviceMajor:0 DeviceMinor:1097 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc/volumes/kubernetes.io~projected/kube-api-access-zjxp2 DeviceMajor:0 DeviceMinor:104 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/136e725a814882d97a92b91f392b5a4bb1498352a85819c564006fc0555c46b2/userdata/shm DeviceMajor:0 DeviceMinor:391 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-430 DeviceMajor:0 DeviceMinor:430 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/646d9925ac7d679e5fe105dacc2e5ba2bf65b630c171bd0e095c89f902ecba0a/userdata/shm DeviceMajor:0 DeviceMinor:460 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e4b55ebf-cab8-4985-95cc-b28bc5ae0578/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:823 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:overlay_0-365 DeviceMajor:0 DeviceMinor:365 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0529b217-a9ef-48fb-b40a-b6789c640c20/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:102 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-321 DeviceMajor:0 DeviceMinor:321 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f872e59-1de1-4a95-8064-79696c73e8ab/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-289 DeviceMajor:0 DeviceMinor:289 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/db9faadf-74e9-4a7f-b3a6-902dd14ac978/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:432 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-769 DeviceMajor:0 DeviceMinor:769 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:814 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~projected/kube-api-access-8z5fj DeviceMajor:0 DeviceMinor:1037 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-159 DeviceMajor:0 DeviceMinor:159 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/7748068f-7409-4972-81d2-84cfb52b7af0/volumes/kubernetes.io~projected/kube-api-access-ws7gk DeviceMajor:0 DeviceMinor:973 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-111 DeviceMajor:0 DeviceMinor:111 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:138 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6d81df6e0c2c501a006e6d355e7ca64b7f375686077a624175b4786dbf2e5138/userdata/shm DeviceMajor:0 DeviceMinor:746 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858/userdata/shm DeviceMajor:0 DeviceMinor:701 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350/userdata/shm DeviceMajor:0 DeviceMinor:245 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/edb84f3680f6b7a9122dea49c8ac75c4b3614e7e24eb119b118fbf82de0d5e2c/userdata/shm DeviceMajor:0 DeviceMinor:843 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-480 DeviceMajor:0 DeviceMinor:480 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1068 DeviceMajor:0 DeviceMinor:1068 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ecb5bdcc-647d-4292-a33d-dc3df331c206/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:224 
Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-562 DeviceMajor:0 DeviceMinor:562 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b57f1c19-f44a-4405-8135-79aef1d1ce07/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:335 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-862 DeviceMajor:0 DeviceMinor:862 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1082 DeviceMajor:0 DeviceMinor:1082 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-407 DeviceMajor:0 DeviceMinor:407 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fd91626c-38a8-462f-8bc0-96d57532de87/volumes/kubernetes.io~projected/kube-api-access-7mjm7 DeviceMajor:0 DeviceMinor:433 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/1ef32245-c238-43c6-a57a-a5ac95aff1f7/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:439 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d/userdata/shm DeviceMajor:0 DeviceMinor:835 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-75 DeviceMajor:0 DeviceMinor:75 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/77ae6dbbf39c4d2991c10b142e9d6fe23b3ada856897b7bc34aa3b7d69fa418b/userdata/shm DeviceMajor:0 DeviceMinor:322 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~projected/kube-api-access-tzdf2 DeviceMajor:0 DeviceMinor:230 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/0932314b-ccf5-4be5-99f8-b99886392daa/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-653 DeviceMajor:0 DeviceMinor:653 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/61427254-6722-4d1a-a96a-dadd24abbe94/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:825 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-627 DeviceMajor:0 DeviceMinor:627 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-692 DeviceMajor:0 DeviceMinor:692 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8d2fdba3-9478-4165-9207-d01483625607/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:98 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5/userdata/shm DeviceMajor:0 DeviceMinor:142 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-647 DeviceMajor:0 DeviceMinor:647 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-857 DeviceMajor:0 DeviceMinor:857 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-995 DeviceMajor:0 DeviceMinor:995 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1084 DeviceMajor:0 
DeviceMinor:1084 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a3c91eef-ec46-419f-b418-ac3a8094b77d/volumes/kubernetes.io~projected/kube-api-access-b9l88 DeviceMajor:0 DeviceMinor:139 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/cf740515-d70d-44b6-ac00-21143b5494d1/volumes/kubernetes.io~projected/kube-api-access-6nfl8 DeviceMajor:0 DeviceMinor:229 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf/volumes/kubernetes.io~projected/kube-api-access-frmjp DeviceMajor:0 DeviceMinor:238 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524/userdata/shm DeviceMajor:0 DeviceMinor:246 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9/userdata/shm DeviceMajor:0 DeviceMinor:253 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-287 DeviceMajor:0 DeviceMinor:287 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-596 DeviceMajor:0 DeviceMinor:596 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/0881de70-2db3-4fc2-b976-b55c11dc239d/volumes/kubernetes.io~projected/kube-api-access-vjkdx DeviceMajor:0 DeviceMinor:820 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c7f667-d30e-41f4-8c0e-f3f138bffab4/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-851 DeviceMajor:0 DeviceMinor:851 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-780 DeviceMajor:0 DeviceMinor:780 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-527 DeviceMajor:0 DeviceMinor:527 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-493 DeviceMajor:0 DeviceMinor:493 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:304 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-854 DeviceMajor:0 DeviceMinor:854 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-664 DeviceMajor:0 DeviceMinor:664 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1076 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-984 DeviceMajor:0 DeviceMinor:984 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-70 DeviceMajor:0 DeviceMinor:70 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1f358d81-87c6-40bf-89e8-5681429285f8/volumes/kubernetes.io~projected/kube-api-access-rpnm8 DeviceMajor:0 DeviceMinor:235 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/var/lib/kubelet/pods/6e69683c-59c5-43da-b105-ef2efb2d0a4e/volumes/kubernetes.io~projected/kube-api-access-wlmhs DeviceMajor:0 DeviceMinor:227 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:386 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-489 DeviceMajor:0 DeviceMinor:489 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-554 DeviceMajor:0 DeviceMinor:554 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-387 DeviceMajor:0 DeviceMinor:387 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8dc7af5f-ff72-4f06-88df-a26ff4c0bded/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:784 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1062 DeviceMajor:0 DeviceMinor:1062 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/cc66541c-6410-4824-b173-53747069429e/volumes/kubernetes.io~projected/kube-api-access-5p4cf DeviceMajor:0 DeviceMinor:115 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-1103 DeviceMajor:0 DeviceMinor:1103 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e34fa9d84124b6c127298dbbcc66ee1981c2d493a18d9fee5da615255d116cb0/userdata/shm DeviceMajor:0 DeviceMinor:395 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-405 DeviceMajor:0 DeviceMinor:405 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b5ed7aff-47c0-42f3-9a26-9385d2bde582/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:394 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/018c9219-d314-4408-ac39-93475d87eefb/volumes/kubernetes.io~projected/kube-api-access-v6lnq DeviceMajor:0 DeviceMinor:724 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/e4b55ebf-cab8-4985-95cc-b28bc5ae0578/volumes/kubernetes.io~projected/kube-api-access-chxxr DeviceMajor:0 DeviceMinor:320 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-957 DeviceMajor:0 DeviceMinor:957 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-199 DeviceMajor:0 DeviceMinor:199 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/03b97fde-467c-46f0-95f9-9c3820b4d790/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:443 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-785 DeviceMajor:0 DeviceMinor:785 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-114 DeviceMajor:0 DeviceMinor:114 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/161beda5-f575-4e60-8baa-5262a4fe86c7/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1077 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:221 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2/userdata/shm DeviceMajor:0 DeviceMinor:266 Capacity:67108864 Type:vfs Inodes:4108170 
HasInodes:true} {Device:/var/lib/kubelet/pods/2c3e94d4-5c6d-4092-975c-e5bca49eb397/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:331 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-305 DeviceMajor:0 DeviceMinor:305 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/893dac15-d6d4-4a1f-988c-59aaf9e63334/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:244 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/17b956d3-c046-4f26-8be2-718c165a3acc/volumes/kubernetes.io~projected/kube-api-access-ch8qd DeviceMajor:0 DeviceMinor:240 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/25332da9-099c-4190-9e24-c19c86830a54/volumes/kubernetes.io~projected/kube-api-access-hqf9z DeviceMajor:0 DeviceMinor:243 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-403 DeviceMajor:0 DeviceMinor:403 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9258b0f-fdcc-4bfa-b982-5cf3c899c432/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:609 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-641 DeviceMajor:0 DeviceMinor:641 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1009 DeviceMajor:0 DeviceMinor:1009 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1029 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/193b3b95-f9a3-4272-853b-86366ce348a2/volumes/kubernetes.io~projected/kube-api-access-fvmjs DeviceMajor:0 DeviceMinor:125 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d/userdata/shm DeviceMajor:0 DeviceMinor:447 Capacity:67108864 Type:vfs Inodes:4108170 HasInodes:true} {Device:/var/lib/kubelet/pods/06ecac2e-bffa-474b-a824-9ba4a194159a/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:762 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true} {Device:overlay_0-767 DeviceMajor:0 DeviceMinor:767 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-618 DeviceMajor:0 DeviceMinor:618 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-909 DeviceMajor:0 DeviceMinor:909 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8df2728b-4f21-4aef-b31f-4197bbcd2728/volumes/kubernetes.io~projected/kube-api-access-74lr7 DeviceMajor:0 DeviceMinor:123 Capacity:32475529216 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0496ccdf85c50cc MacAddress:fe:3a:fa:97:4b:b9 Speed:10000 Mtu:8900} {Name:0731faf1ccc38c5 MacAddress:5a:eb:2c:de:95:9c Speed:10000 Mtu:8900} {Name:0914c9dfe834a27 MacAddress:02:98:42:56:82:dc Speed:10000 Mtu:8900} {Name:09bada5ccab47e8 MacAddress:8a:b6:92:88:c5:00 Speed:10000 Mtu:8900} {Name:136e725a814882d MacAddress:36:06:26:45:6e:4c Speed:10000 Mtu:8900} {Name:19f35bad4079f0b 
MacAddress:fa:33:fe:ef:06:cb Speed:10000 Mtu:8900} {Name:19fc005175f8b2f MacAddress:06:17:47:51:b3:cd Speed:10000 Mtu:8900} {Name:1ef234f61cea7c4 MacAddress:42:9a:13:6b:0b:64 Speed:10000 Mtu:8900} {Name:26320b73ca3fce1 MacAddress:42:1f:a7:c6:bc:9a Speed:10000 Mtu:8900} {Name:2e4a3a4a7895f01 MacAddress:7a:83:1f:33:98:b8 Speed:10000 Mtu:8900} {Name:2e7b5b751a85830 MacAddress:7a:8a:d0:0f:38:fc Speed:10000 Mtu:8900} {Name:2fe7b69e87a4fa6 MacAddress:6a:17:33:8e:85:4b Speed:10000 Mtu:8900} {Name:3471f8b061f6936 MacAddress:ea:dc:f5:56:6e:db Speed:10000 Mtu:8900} {Name:3570848357e5506 MacAddress:66:b0:e4:67:90:d5 Speed:10000 Mtu:8900} {Name:362b488b60e500e MacAddress:ae:2a:c8:e6:3c:3d Speed:10000 Mtu:8900} {Name:390d92c6b1bf8de MacAddress:da:39:66:95:84:de Speed:10000 Mtu:8900} {Name:3f70a6e48f4961d MacAddress:c6:12:eb:ea:40:3c Speed:10000 Mtu:8900} {Name:4a9a41f76fe188e MacAddress:e2:2b:ee:24:68:d1 Speed:10000 Mtu:8900} {Name:4f70e184622d577 MacAddress:96:99:c4:6a:09:9c Speed:10000 Mtu:8900} {Name:51b866160e4a9eb MacAddress:f6:42:a3:fb:ea:91 Speed:10000 Mtu:8900} {Name:5759216ebfee850 MacAddress:7e:d4:19:06:1b:2e Speed:10000 Mtu:8900} {Name:5f4e5674ade432e MacAddress:5a:7f:14:8c:dc:ee Speed:10000 Mtu:8900} {Name:646d9925ac7d679 MacAddress:4e:95:44:dd:d2:ed Speed:10000 Mtu:8900} {Name:64cac6ba3a561ad MacAddress:82:c1:1d:2a:37:90 Speed:10000 Mtu:8900} {Name:6d81df6e0c2c501 MacAddress:7a:70:e5:1b:c0:a8 Speed:10000 Mtu:8900} {Name:7502f9cc62ba09f MacAddress:5e:9f:92:3a:40:86 Speed:10000 Mtu:8900} {Name:7623887564e1fd2 MacAddress:56:8c:97:29:ac:9b Speed:10000 Mtu:8900} {Name:77ae6dbbf39c4d2 MacAddress:92:47:2a:b2:7b:35 Speed:10000 Mtu:8900} {Name:77b4f8a8bc89194 MacAddress:f6:f8:9c:e2:19:7b Speed:10000 Mtu:8900} {Name:7d64d717a487ab9 MacAddress:56:96:e5:8b:cb:15 Speed:10000 Mtu:8900} {Name:7d8988c40bcb4c1 MacAddress:d2:c3:5c:4a:13:33 Speed:10000 Mtu:8900} {Name:83984d61bee36a6 MacAddress:6a:ee:1f:75:29:26 Speed:10000 Mtu:8900} {Name:8988806dc69dce5 MacAddress:ea:2d:2b:a9:73:dc 
Speed:10000 Mtu:8900} {Name:8dcc826566dd71c MacAddress:72:09:99:aa:20:a8 Speed:10000 Mtu:8900} {Name:907b8fe5b1745c9 MacAddress:e2:d0:99:2c:9d:ef Speed:10000 Mtu:8900} {Name:9eb81fef2a10fda MacAddress:b6:af:fa:33:95:b2 Speed:10000 Mtu:8900} {Name:9fb60bfa59d2ff4 MacAddress:f2:67:33:70:90:c2 Speed:10000 Mtu:8900} {Name:a4d11bdc39191c7 MacAddress:d6:b1:9a:18:b0:56 Speed:10000 Mtu:8900} {Name:aba1a9619c2284c MacAddress:52:4e:6f:96:67:56 Speed:10000 Mtu:8900} {Name:b090a7b841b2284 MacAddress:1a:54:48:4f:97:d6 Speed:10000 Mtu:8900} {Name:b4158eeef011b1e MacAddress:e2:2d:a0:90:4f:8c Speed:10000 Mtu:8900} {Name:b6406db9242e359 MacAddress:4a:9b:1a:1a:4e:48 Speed:10000 Mtu:8900} {Name:bcfacb71ae88d50 MacAddress:96:c9:19:18:13:c9 Speed:10000 Mtu:8900} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:9000} {Name:br-int MacAddress:f6:ce:79:0c:e6:f0 Speed:0 Mtu:8900} {Name:c49cb5ec4e7e39a MacAddress:42:69:e8:c0:a1:b7 Speed:10000 Mtu:8900} {Name:c918fb3b270e41c MacAddress:ce:22:67:30:a8:49 Speed:10000 Mtu:8900} {Name:d3d43a9e0d6fcad MacAddress:6e:e9:82:a3:e9:c1 Speed:10000 Mtu:8900} {Name:da062cae7ba3072 MacAddress:ce:ed:e0:2d:91:4b Speed:10000 Mtu:8900} {Name:e34fa9d84124b6c MacAddress:d6:0c:42:bf:e6:fa Speed:10000 Mtu:8900} {Name:e3e74e8a6d87769 MacAddress:d6:3b:5f:f8:16:45 Speed:10000 Mtu:8900} {Name:e91ae8a44c4b4ac MacAddress:8e:e7:5f:88:ad:d5 Speed:10000 Mtu:8900} {Name:e95e82ba3152944 MacAddress:9a:1b:3f:29:df:bc Speed:10000 Mtu:8900} {Name:edb84f3680f6b7a MacAddress:22:f2:c6:2c:83:84 Speed:10000 Mtu:8900} {Name:eeb72465bb1427c MacAddress:7a:84:f0:4e:35:61 Speed:10000 Mtu:8900} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:9000} {Name:eth1 MacAddress:fa:16:3e:35:d5:aa Speed:-1 Mtu:9000} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:8900} {Name:ovs-system MacAddress:96:ef:67:d7:01:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 
Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] 
UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 13 10:41:32.432355 master-0 kubenswrapper[17876]: I0313 10:41:32.432206 17876 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 13 10:41:32.432673 master-0 kubenswrapper[17876]: I0313 10:41:32.432480 17876 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433010 17876 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433246 17876 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433348 17876 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"P
ercentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433774 17876 topology_manager.go:138] "Creating topology manager with none policy" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433794 17876 container_manager_linux.go:303] "Creating device plugin manager" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433856 17876 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.433912 17876 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434024 17876 state_mem.go:36] "Initialized new in-memory state store" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434169 17876 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434307 17876 kubelet.go:418] "Attempting to sync node with API server" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434333 17876 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434397 17876 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434427 17876 kubelet.go:324] "Adding apiserver pod source" Mar 
13 10:41:32.434805 master-0 kubenswrapper[17876]: I0313 10:41:32.434457 17876 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 13 10:41:32.436484 master-0 kubenswrapper[17876]: I0313 10:41:32.436446 17876 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 13 10:41:32.436739 master-0 kubenswrapper[17876]: I0313 10:41:32.436707 17876 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 13 10:41:32.437268 master-0 kubenswrapper[17876]: I0313 10:41:32.437238 17876 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 13 10:41:32.437496 master-0 kubenswrapper[17876]: I0313 10:41:32.437474 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 13 10:41:32.437529 master-0 kubenswrapper[17876]: I0313 10:41:32.437498 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 13 10:41:32.437529 master-0 kubenswrapper[17876]: I0313 10:41:32.437506 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 13 10:41:32.437529 master-0 kubenswrapper[17876]: I0313 10:41:32.437512 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 13 10:41:32.437529 master-0 kubenswrapper[17876]: I0313 10:41:32.437518 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 13 10:41:32.437529 master-0 kubenswrapper[17876]: I0313 10:41:32.437525 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 13 10:41:32.437529 master-0 kubenswrapper[17876]: I0313 10:41:32.437531 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 13 10:41:32.437706 master-0 kubenswrapper[17876]: I0313 10:41:32.437539 17876 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 13 10:41:32.437706 master-0 kubenswrapper[17876]: I0313 10:41:32.437546 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 13 10:41:32.437706 master-0 kubenswrapper[17876]: I0313 10:41:32.437553 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 13 10:41:32.437706 master-0 kubenswrapper[17876]: I0313 10:41:32.437568 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 13 10:41:32.437706 master-0 kubenswrapper[17876]: I0313 10:41:32.437580 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 13 10:41:32.437706 master-0 kubenswrapper[17876]: I0313 10:41:32.437621 17876 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 13 10:41:32.438562 master-0 kubenswrapper[17876]: I0313 10:41:32.438530 17876 server.go:1280] "Started kubelet" Mar 13 10:41:32.439247 master-0 kubenswrapper[17876]: I0313 10:41:32.439208 17876 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 13 10:41:32.439365 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 13 10:41:32.447326 master-0 kubenswrapper[17876]: I0313 10:41:32.441680 17876 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 13 10:41:32.447326 master-0 kubenswrapper[17876]: I0313 10:41:32.441919 17876 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 13 10:41:32.447326 master-0 kubenswrapper[17876]: I0313 10:41:32.442491 17876 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 13 10:41:32.447326 master-0 kubenswrapper[17876]: I0313 10:41:32.444658 17876 server.go:449] "Adding debug handlers to kubelet server" Mar 13 10:41:32.460541 master-0 kubenswrapper[17876]: I0313 10:41:32.460502 17876 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 13 10:41:32.460737 master-0 kubenswrapper[17876]: I0313 10:41:32.460557 17876 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 13 10:41:32.461295 master-0 kubenswrapper[17876]: I0313 10:41:32.461245 17876 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-14 10:25:32 +0000 UTC, rotation deadline is 2026-03-14 05:32:24.414877835 +0000 UTC Mar 13 10:41:32.461409 master-0 kubenswrapper[17876]: I0313 10:41:32.461385 17876 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h50m51.953498995s for next certificate rotation Mar 13 10:41:32.461528 master-0 kubenswrapper[17876]: I0313 10:41:32.461479 17876 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 13 10:41:32.461528 master-0 kubenswrapper[17876]: I0313 10:41:32.461520 17876 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 13 10:41:32.461904 master-0 kubenswrapper[17876]: I0313 10:41:32.461682 17876 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 13 10:41:32.461904 master-0 kubenswrapper[17876]: E0313 10:41:32.461687 17876 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:41:32.465613 master-0 kubenswrapper[17876]: I0313 10:41:32.465158 17876 factory.go:55] Registering systemd factory Mar 13 10:41:32.465613 master-0 kubenswrapper[17876]: I0313 10:41:32.465194 17876 factory.go:221] Registration of the systemd container factory successfully Mar 13 10:41:32.465792 master-0 kubenswrapper[17876]: I0313 10:41:32.465773 17876 factory.go:153] Registering CRI-O factory Mar 13 10:41:32.465792 master-0 kubenswrapper[17876]: I0313 10:41:32.465790 17876 factory.go:221] Registration of the crio container factory successfully Mar 13 10:41:32.465899 master-0 kubenswrapper[17876]: I0313 10:41:32.465866 17876 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 13 10:41:32.465899 master-0 kubenswrapper[17876]: I0313 10:41:32.465893 17876 factory.go:103] Registering Raw factory Mar 13 10:41:32.465969 master-0 kubenswrapper[17876]: I0313 10:41:32.465907 17876 manager.go:1196] Started watching for new ooms in manager Mar 13 10:41:32.466579 master-0 kubenswrapper[17876]: I0313 10:41:32.466374 17876 manager.go:319] Starting recovery of all containers Mar 13 10:41:32.470402 master-0 kubenswrapper[17876]: E0313 10:41:32.470310 17876 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 13 10:41:32.482113 master-0 kubenswrapper[17876]: I0313 10:41:32.482015 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="97328e01-1227-417e-9af7-6426495d96db" volumeName="kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert" seLinuxMountContext="" Mar 13 10:41:32.482113 master-0 kubenswrapper[17876]: I0313 10:41:32.482080 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls" seLinuxMountContext="" Mar 13 10:41:32.482113 master-0 kubenswrapper[17876]: I0313 10:41:32.482095 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21bb85e2-0d4a-418f-a7c9-482e8eafce19" volumeName="kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca" seLinuxMountContext="" Mar 13 10:41:32.482113 master-0 kubenswrapper[17876]: I0313 10:41:32.482111 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c5279e3-0165-4347-bfc7-87b80accaab3" volumeName="kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482121 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7090328-1191-4c7c-afed-603d7333014f" volumeName="kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482147 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ba3e43ba-2840-4612-a370-87ad3c5a382a" volumeName="kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482157 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec33c506-8abe-4659-84d3-a294c31b446c" volumeName="kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-ca-certs" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482167 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0529b217-a9ef-48fb-b40a-b6789c640c20" volumeName="kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482184 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482203 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482222 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ca1b7c7-41af-46e9-8f5d-a476ee2b7587" volumeName="kubernetes.io/projected/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-kube-api-access-qr5lp" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482238 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="06ecac2e-bffa-474b-a824-9ba4a194159a" volumeName="kubernetes.io/projected/06ecac2e-bffa-474b-a824-9ba4a194159a-kube-api-access-6p29b" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482253 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" volumeName="kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482267 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58685de6-b4ae-4229-870b-5143a6010450" volumeName="kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482280 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dc7af5f-ff72-4f06-88df-a26ff4c0bded" volumeName="kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482289 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" volumeName="kubernetes.io/configmap/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-service-ca-bundle" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482305 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3a72b45-a705-4335-9c04-c952ec5d9975" volumeName="kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-tmp" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482316 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482326 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-serving-cert" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482339 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec33c506-8abe-4659-84d3-a294c31b446c" volumeName="kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-kube-api-access-jk4qr" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482354 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b956d3-c046-4f26-8be2-718c165a3acc" volumeName="kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482373 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-etcd-serving-ca" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482383 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482392 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="a13f3e08-2b67-404f-8695-77aa17f92137" volumeName="kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482407 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3a72b45-a705-4335-9c04-c952ec5d9975" volumeName="kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-tuned" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482417 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 10:41:32.482423 master-0 kubenswrapper[17876]: I0313 10:41:32.482433 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="893dac15-d6d4-4a1f-988c-59aaf9e63334" volumeName="kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482455 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="97328e01-1227-417e-9af7-6426495d96db" volumeName="kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482467 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="277614e8-838f-4773-bcfc-89f19c620dee" volumeName="kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482477 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="d9fd7b06-d61d-47c3-a08f-846245c79cc9" volumeName="kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482492 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="257ae542-4a06-42d3-b3e8-bf0a376494a8" volumeName="kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-catalog-content" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482502 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" volumeName="kubernetes.io/projected/84f78350-e85c-4377-97cd-9e9a1b2ff4ee-kube-api-access-d5v4b" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482516 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b07c5ae-1149-4031-bd92-6df4331e586c" volumeName="kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-catalog-content" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482527 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482537 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-policies" seLinuxMountContext="" Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482549 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="b5ed7aff-47c0-42f3-9a26-9385d2bde582" volumeName="kubernetes.io/secret/b5ed7aff-47c0-42f3-9a26-9385d2bde582-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482558 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482573 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21bb85e2-0d4a-418f-a7c9-482e8eafce19" volumeName="kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482582 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482593 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61427254-6722-4d1a-a96a-dadd24abbe94" volumeName="kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482608 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61427254-6722-4d1a-a96a-dadd24abbe94" volumeName="kubernetes.io/projected/61427254-6722-4d1a-a96a-dadd24abbe94-kube-api-access-6vbsc" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482618 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c5279e3-0165-4347-bfc7-87b80accaab3" volumeName="kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482632 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="97328e01-1227-417e-9af7-6426495d96db" volumeName="kubernetes.io/projected/97328e01-1227-417e-9af7-6426495d96db-kube-api-access-ffmmr" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482642 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-serving-ca" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482654 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3e15f776-d153-4289-91c7-893584104185" volumeName="kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482672 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3e15f776-d153-4289-91c7-893584104185" volumeName="kubernetes.io/projected/3e15f776-d153-4289-91c7-893584104185-kube-api-access-2gcf6" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482682 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482694 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" volumeName="kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-metrics-certs" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482706 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c" volumeName="kubernetes.io/projected/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-kube-api-access-ppjzw" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482716 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a13f3e08-2b67-404f-8695-77aa17f92137" volumeName="kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482729 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="893dac15-d6d4-4a1f-988c-59aaf9e63334" volumeName="kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482739 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b57f1c19-f44a-4405-8135-79aef1d1ce07" volumeName="kubernetes.io/projected/b57f1c19-f44a-4405-8135-79aef1d1ce07-kube-api-access-mnnnp" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482758 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d2fdba3-9478-4165-9207-d01483625607" volumeName="kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482775 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dc7af5f-ff72-4f06-88df-a26ff4c0bded" volumeName="kubernetes.io/projected/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-kube-api-access-vmf6l" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482787 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e" volumeName="kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482801 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-audit" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482818 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0917212-59d8-4799-a9bc-52e358c5e8a0" volumeName="kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482829 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7090328-1191-4c7c-afed-603d7333014f" volumeName="kubernetes.io/projected/b7090328-1191-4c7c-afed-603d7333014f-kube-api-access-v9cxp" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482839 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3a72b45-a705-4335-9c04-c952ec5d9975" volumeName="kubernetes.io/projected/a3a72b45-a705-4335-9c04-c952ec5d9975-kube-api-access-b5gkd" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482851 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482862 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99b999c-4213-4d29-ab14-26c584e88445" volumeName="kubernetes.io/projected/f99b999c-4213-4d29-ab14-26c584e88445-kube-api-access-bn7vq" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482875 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482886 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" volumeName="kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482896 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906" volumeName="kubernetes.io/empty-dir/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-snapshots" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482909 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482921 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fd91626c-38a8-462f-8bc0-96d57532de87" volumeName="kubernetes.io/projected/fd91626c-38a8-462f-8bc0-96d57532de87-kube-api-access-7mjm7" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482935 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="161beda5-f575-4e60-8baa-5262a4fe86c7" volumeName="kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482945 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2563ecb2-5783-4c45-a7f6-180e14e1c8c4" volumeName="kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482954 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" volumeName="kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-stats-auth" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482967 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cc66541c-6410-4824-b173-53747069429e" volumeName="kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.482977 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483014 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c3e94d4-5c6d-4092-975c-e5bca49eb397" volumeName="kubernetes.io/projected/2c3e94d4-5c6d-4092-975c-e5bca49eb397-kube-api-access-htb49" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483024 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0881de70-2db3-4fc2-b976-b55c11dc239d" volumeName="kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483034 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b02805e2-f186-4e59-bdfa-f4793263b468" volumeName="kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483046 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483055 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9fd7b06-d61d-47c3-a08f-846245c79cc9" volumeName="kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483065 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483079 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" volumeName="kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483089 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c5279e3-0165-4347-bfc7-87b80accaab3" volumeName="kubernetes.io/empty-dir/7c5279e3-0165-4347-bfc7-87b80accaab3-volume-directive-shadow" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483105 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db9faadf-74e9-4a7f-b3a6-902dd14ac978" volumeName="kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-kube-api-access-nqrh5" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483114 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9ca1b7c7-41af-46e9-8f5d-a476ee2b7587" volumeName="kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483143 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0917212-59d8-4799-a9bc-52e358c5e8a0" volumeName="kubernetes.io/projected/a0917212-59d8-4799-a9bc-52e358c5e8a0-kube-api-access-lrmcp" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483157 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba3e43ba-2840-4612-a370-87ad3c5a382a" volumeName="kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483169 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9fd7b06-d61d-47c3-a08f-846245c79cc9" volumeName="kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq" seLinuxMountContext=""
Mar 13 10:41:32.483107 master-0 kubenswrapper[17876]: I0313 10:41:32.483181 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db9faadf-74e9-4a7f-b3a6-902dd14ac978" volumeName="kubernetes.io/empty-dir/db9faadf-74e9-4a7f-b3a6-902dd14ac978-cache" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483190 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4b55ebf-cab8-4985-95cc-b28bc5ae0578" volumeName="kubernetes.io/projected/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-kube-api-access-chxxr" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483202 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e87ca16c-25de-4fea-b900-2960f4a5f95e" volumeName="kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483217 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f358d81-87c6-40bf-89e8-5681429285f8" volumeName="kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483234 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dc7af5f-ff72-4f06-88df-a26ff4c0bded" volumeName="kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483247 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483258 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906" volumeName="kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483272 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7d31378-e940-4473-ab37-10f250c76666" volumeName="kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483294 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="024d9bd3-ac77-4257-9808-7518f2a73e11" volumeName="kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483305 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61427254-6722-4d1a-a96a-dadd24abbe94" volumeName="kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483329 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f872e59-1de1-4a95-8064-79696c73e8ab" volumeName="kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483339 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b02805e2-f186-4e59-bdfa-f4793263b468" volumeName="kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483354 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b02805e2-f186-4e59-bdfa-f4793263b468" volumeName="kubernetes.io/projected/b02805e2-f186-4e59-bdfa-f4793263b468-kube-api-access-cvl4j" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483372 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0881de70-2db3-4fc2-b976-b55c11dc239d" volumeName="kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483383 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483402 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b07c5ae-1149-4031-bd92-6df4331e586c" volumeName="kubernetes.io/projected/8b07c5ae-1149-4031-bd92-6df4331e586c-kube-api-access-4kn26" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483414 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" volumeName="kubernetes.io/projected/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-kube-api-access-8z5fj" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483425 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-etcd-client" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483438 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e69683c-59c5-43da-b105-ef2efb2d0a4e" volumeName="kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483471 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483492 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5ed7aff-47c0-42f3-9a26-9385d2bde582" volumeName="kubernetes.io/projected/b5ed7aff-47c0-42f3-9a26-9385d2bde582-kube-api-access" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483533 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483545 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483565 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b57f1c19-f44a-4405-8135-79aef1d1ce07" volumeName="kubernetes.io/secret/b57f1c19-f44a-4405-8135-79aef1d1ce07-cluster-storage-operator-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483586 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8df2728b-4f21-4aef-b31f-4197bbcd2728" volumeName="kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483600 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-client" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483616 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f99b999c-4213-4d29-ab14-26c584e88445" volumeName="kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-catalog-content" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483633 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483651 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2157cb66-d458-4353-bc9c-ef761e61e5c5" volumeName="kubernetes.io/projected/2157cb66-d458-4353-bc9c-ef761e61e5c5-kube-api-access-gntlk" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483668 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db9faadf-74e9-4a7f-b3a6-902dd14ac978" volumeName="kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-ca-certs" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483683 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="58685de6-b4ae-4229-870b-5143a6010450" volumeName="kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483703 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a3c91eef-ec46-419f-b418-ac3a8094b77d" volumeName="kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483714 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0881de70-2db3-4fc2-b976-b55c11dc239d" volumeName="kubernetes.io/projected/0881de70-2db3-4fc2-b976-b55c11dc239d-kube-api-access-vjkdx" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483752 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b956d3-c046-4f26-8be2-718c165a3acc" volumeName="kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483769 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61427254-6722-4d1a-a96a-dadd24abbe94" volumeName="kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483779 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e69683c-59c5-43da-b105-ef2efb2d0a4e" volumeName="kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483793 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0529b217-a9ef-48fb-b40a-b6789c640c20" volumeName="kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483804 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ef32245-c238-43c6-a57a-a5ac95aff1f7" volumeName="kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483821 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2563ecb2-5783-4c45-a7f6-180e14e1c8c4" volumeName="kubernetes.io/projected/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-kube-api-access-4fcqg" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483839 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5da919b6-8545-4001-89f3-74cb289327f0" volumeName="kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483848 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e" volumeName="kubernetes.io/projected/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-kube-api-access-5rqms" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483860 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0917212-59d8-4799-a9bc-52e358c5e8a0" volumeName="kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483874 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7090328-1191-4c7c-afed-603d7333014f" volumeName="kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483884 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e4b55ebf-cab8-4985-95cc-b28bc5ae0578" volumeName="kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483896 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03b97fde-467c-46f0-95f9-9c3820b4d790" volumeName="kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483907 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21bb85e2-0d4a-418f-a7c9-482e8eafce19" volumeName="kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483920 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8d2fdba3-9478-4165-9207-d01483625607" volumeName="kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483929 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d9fd7b06-d61d-47c3-a08f-846245c79cc9" volumeName="kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483939 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="db9faadf-74e9-4a7f-b3a6-902dd14ac978" volumeName="kubernetes.io/secret/db9faadf-74e9-4a7f-b3a6-902dd14ac978-catalogserver-certs" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483951 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" volumeName="kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483961 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483973 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="258f571e-5ec8-42df-b4ba-17457d87d10d" volumeName="kubernetes.io/secret/258f571e-5ec8-42df-b4ba-17457d87d10d-tls-certificates" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.483984 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" volumeName="kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484000 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="893dac15-d6d4-4a1f-988c-59aaf9e63334" volumeName="kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484015 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53da2840-4a92-497a-a9d3-973583887147" volumeName="kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484026 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7748068f-7409-4972-81d2-84cfb52b7af0" volumeName="kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484039 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" volumeName="kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484048 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484059 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="03b97fde-467c-46f0-95f9-9c3820b4d790" volumeName="kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484073 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" volumeName="kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484082 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ba3e43ba-2840-4612-a370-87ad3c5a382a" volumeName="kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484100 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="277614e8-838f-4773-bcfc-89f19c620dee" volumeName="kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484125 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8b07c5ae-1149-4031-bd92-6df4331e586c" volumeName="kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-utilities" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484136 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0881de70-2db3-4fc2-b976-b55c11dc239d" volumeName="kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484150 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="193b3b95-f9a3-4272-853b-86366ce348a2" volumeName="kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484161 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f358d81-87c6-40bf-89e8-5681429285f8" volumeName="kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484174 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" volumeName="kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484185 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7c5279e3-0165-4347-bfc7-87b80accaab3" volumeName="kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484194 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/projected/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-kube-api-access-4nr6p" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484208 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b460735c-56aa-4dd3-a756-759859083e12" volumeName="kubernetes.io/projected/b460735c-56aa-4dd3-a756-759859083e12-kube-api-access-qr9x5" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484218 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484230 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc" volumeName="kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484240 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0529b217-a9ef-48fb-b40a-b6789c640c20" volumeName="kubernetes.io/projected/0529b217-a9ef-48fb-b40a-b6789c640c20-kube-api-access-m5x2b" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484250 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="21bb85e2-0d4a-418f-a7c9-482e8eafce19" volumeName="kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484264 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53da2840-4a92-497a-a9d3-973583887147" volumeName="kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484274 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7748068f-7409-4972-81d2-84cfb52b7af0" volumeName="kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca" seLinuxMountContext=""
Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484287 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906"
volumeName="kubernetes.io/projected/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-kube-api-access-g8dpd" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484298 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0917212-59d8-4799-a9bc-52e358c5e8a0" volumeName="kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484308 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2157cb66-d458-4353-bc9c-ef761e61e5c5" volumeName="kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-utilities" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484321 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" volumeName="kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484331 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484345 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="161beda5-f575-4e60-8baa-5262a4fe86c7" volumeName="kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484355 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" 
volumeName="kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484366 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" volumeName="kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484378 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e" volumeName="kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484387 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ef32245-c238-43c6-a57a-a5ac95aff1f7" volumeName="kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484399 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/projected/018c9219-d314-4408-ac39-93475d87eefb-kube-api-access-v6lnq" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484410 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7d31378-e940-4473-ab37-10f250c76666" volumeName="kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484433 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc" volumeName="kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484448 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="06ecac2e-bffa-474b-a824-9ba4a194159a" volumeName="kubernetes.io/secret/06ecac2e-bffa-474b-a824-9ba4a194159a-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484458 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" volumeName="kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484470 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="277614e8-838f-4773-bcfc-89f19c620dee" volumeName="kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484480 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" volumeName="kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484494 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="53da2840-4a92-497a-a9d3-973583887147" volumeName="kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484509 17876 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484525 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2157cb66-d458-4353-bc9c-ef761e61e5c5" volumeName="kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-catalog-content" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484535 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" volumeName="kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484549 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8dc7af5f-ff72-4f06-88df-a26ff4c0bded" volumeName="kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484564 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="018c9219-d314-4408-ac39-93475d87eefb" volumeName="kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-image-import-ca" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484590 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f872e59-1de1-4a95-8064-79696c73e8ab" volumeName="kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484610 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="2c3e94d4-5c6d-4092-975c-e5bca49eb397" volumeName="kubernetes.io/secret/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-key" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484627 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" volumeName="kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484642 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="94f7921a-6d0f-45b7-ba8f-9f2ef74b044e" volumeName="kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-default-certificate" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484651 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906" volumeName="kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484664 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0932314b-ccf5-4be5-99f8-b99886392daa" volumeName="kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484675 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ec33c506-8abe-4659-84d3-a294c31b446c" volumeName="kubernetes.io/empty-dir/ec33c506-8abe-4659-84d3-a294c31b446c-cache" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484693 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="3e15f776-d153-4289-91c7-893584104185" volumeName="kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484713 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a7b698d2-f23a-4404-bc63-757ca549356f" volumeName="kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484730 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="257ae542-4a06-42d3-b3e8-bf0a376494a8" volumeName="kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-utilities" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484756 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="257ae542-4a06-42d3-b3e8-bf0a376494a8" volumeName="kubernetes.io/projected/257ae542-4a06-42d3-b3e8-bf0a376494a8-kube-api-access-fswp7" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484773 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8df2728b-4f21-4aef-b31f-4197bbcd2728" volumeName="kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484789 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cf740515-d70d-44b6-ac00-21143b5494d1" volumeName="kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484802 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="f99b999c-4213-4d29-ab14-26c584e88445" volumeName="kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-utilities" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484812 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484825 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="024d9bd3-ac77-4257-9808-7518f2a73e11" volumeName="kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484837 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="17b956d3-c046-4f26-8be2-718c165a3acc" volumeName="kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484856 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-encryption-config" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484869 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0881de70-2db3-4fc2-b976-b55c11dc239d" volumeName="kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484878 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="7748068f-7409-4972-81d2-84cfb52b7af0" volumeName="kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484895 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25332da9-099c-4190-9e24-c19c86830a54" volumeName="kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484905 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f872e59-1de1-4a95-8064-79696c73e8ab" volumeName="kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484916 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484928 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2c3e94d4-5c6d-4092-975c-e5bca49eb397" volumeName="kubernetes.io/configmap/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-cabundle" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484937 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b7090328-1191-4c7c-afed-603d7333014f" volumeName="kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484963 17876 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="161beda5-f575-4e60-8baa-5262a4fe86c7" volumeName="kubernetes.io/projected/161beda5-f575-4e60-8baa-5262a4fe86c7-kube-api-access-q6smf" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484980 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7748068f-7409-4972-81d2-84cfb52b7af0" volumeName="kubernetes.io/projected/7748068f-7409-4972-81d2-84cfb52b7af0-kube-api-access-ws7gk" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.484995 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ecb5bdcc-647d-4292-a33d-dc3df331c206" volumeName="kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle" seLinuxMountContext="" Mar 13 10:41:32.484901 master-0 kubenswrapper[17876]: I0313 10:41:32.485011 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" volumeName="kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485026 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5da919b6-8545-4001-89f3-74cb289327f0" volumeName="kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485044 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906" volumeName="kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485058 17876 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc" volumeName="kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485073 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1f358d81-87c6-40bf-89e8-5681429285f8" volumeName="kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485087 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" volumeName="kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-trusted-ca-bundle" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485107 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6e69683c-59c5-43da-b105-ef2efb2d0a4e" volumeName="kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485195 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ef32245-c238-43c6-a57a-a5ac95aff1f7" volumeName="kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485213 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" volumeName="kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485228 17876 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="e4b55ebf-cab8-4985-95cc-b28bc5ae0578" volumeName="kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485243 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fb060653-0d4b-4759-a7a1-c5dce194cce7" volumeName="kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485254 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="97328e01-1227-417e-9af7-6426495d96db" volumeName="kubernetes.io/empty-dir/97328e01-1227-417e-9af7-6426495d96db-tmpfs" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485274 17876 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b5ed7aff-47c0-42f3-9a26-9385d2bde582" volumeName="kubernetes.io/configmap/b5ed7aff-47c0-42f3-9a26-9385d2bde582-service-ca" seLinuxMountContext="" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485290 17876 reconstruct.go:97] "Volume reconstruction finished" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.485304 17876 reconciler.go:26] "Reconciler: start to sync state" Mar 13 10:41:32.489734 master-0 kubenswrapper[17876]: I0313 10:41:32.487926 17876 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 13 10:41:32.492689 master-0 kubenswrapper[17876]: I0313 10:41:32.492655 17876 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 13 10:41:32.492746 master-0 kubenswrapper[17876]: I0313 10:41:32.492711 17876 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 13 10:41:32.492746 master-0 kubenswrapper[17876]: I0313 10:41:32.492742 17876 kubelet.go:2335] "Starting kubelet main sync loop" Mar 13 10:41:32.492833 master-0 kubenswrapper[17876]: E0313 10:41:32.492794 17876 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 13 10:41:32.505433 master-0 kubenswrapper[17876]: I0313 10:41:32.505378 17876 generic.go:334] "Generic (PLEG): container finished" podID="018c9219-d314-4408-ac39-93475d87eefb" containerID="74ae020ca7669fb01b80f8f98f454493cc6cfee0df109ea9dc9a0bb83ef979da" exitCode=0 Mar 13 10:41:32.507772 master-0 kubenswrapper[17876]: I0313 10:41:32.507725 17876 generic.go:334] "Generic (PLEG): container finished" podID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerID="39e3998474ffa5421ada785b69659b745abc434915dc0302700b2f60923ba978" exitCode=0 Mar 13 10:41:32.516594 master-0 kubenswrapper[17876]: I0313 10:41:32.516531 17876 generic.go:334] "Generic (PLEG): container finished" podID="5da919b6-8545-4001-89f3-74cb289327f0" containerID="a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4" exitCode=0 Mar 13 10:41:32.524868 master-0 kubenswrapper[17876]: I0313 10:41:32.524766 17876 generic.go:334] "Generic (PLEG): container finished" podID="3b44838d-cfe0-42fe-9927-d0b5391eee81" containerID="4f57dbde7e6dd83a3f45d28b694622a3cd36e451a3d2e531b974cdf91eee3a45" exitCode=0 Mar 13 10:41:32.540336 master-0 kubenswrapper[17876]: I0313 10:41:32.540255 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-22jb5_ec33c506-8abe-4659-84d3-a294c31b446c/manager/0.log" Mar 13 10:41:32.540336 master-0 kubenswrapper[17876]: I0313 10:41:32.540337 17876 
generic.go:334] "Generic (PLEG): container finished" podID="ec33c506-8abe-4659-84d3-a294c31b446c" containerID="eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946" exitCode=1 Mar 13 10:41:32.542447 master-0 kubenswrapper[17876]: I0313 10:41:32.542412 17876 generic.go:334] "Generic (PLEG): container finished" podID="a9258b0f-fdcc-4bfa-b982-5cf3c899c432" containerID="4bdeab3ebfebb7845458ea9c29cbf7443ef96922911395dc3575274a6c5d9316" exitCode=0 Mar 13 10:41:32.545994 master-0 kubenswrapper[17876]: I0313 10:41:32.545941 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/0.log" Mar 13 10:41:32.546076 master-0 kubenswrapper[17876]: I0313 10:41:32.545992 17876 generic.go:334] "Generic (PLEG): container finished" podID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" containerID="65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e" exitCode=1 Mar 13 10:41:32.556259 master-0 kubenswrapper[17876]: I0313 10:41:32.556204 17876 generic.go:334] "Generic (PLEG): container finished" podID="4c3280e9367536f782caf8bdc07edb85" containerID="b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d" exitCode=0 Mar 13 10:41:32.561856 master-0 kubenswrapper[17876]: E0313 10:41:32.561818 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 13 10:41:32.565012 master-0 kubenswrapper[17876]: I0313 10:41:32.564951 17876 generic.go:334] "Generic (PLEG): container finished" podID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerID="a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f" exitCode=0 Mar 13 10:41:32.567354 master-0 kubenswrapper[17876]: I0313 10:41:32.567293 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_1e86a3b0-37b3-4df1-a522-f29cda076753/installer/0.log" Mar 13 
10:41:32.567419 master-0 kubenswrapper[17876]: I0313 10:41:32.567349 17876 generic.go:334] "Generic (PLEG): container finished" podID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerID="d19b978c1e8101a0212df3b6611d9d31aa1e8b34d80df670a9b5c7dd94abdbf2" exitCode=1
Mar 13 10:41:32.569325 master-0 kubenswrapper[17876]: I0313 10:41:32.569273 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-657wt_db9faadf-74e9-4a7f-b3a6-902dd14ac978/manager/0.log"
Mar 13 10:41:32.569708 master-0 kubenswrapper[17876]: I0313 10:41:32.569667 17876 generic.go:334] "Generic (PLEG): container finished" podID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerID="84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c" exitCode=1
Mar 13 10:41:32.570837 master-0 kubenswrapper[17876]: I0313 10:41:32.570806 17876 generic.go:334] "Generic (PLEG): container finished" podID="994d29a3-98d8-45bd-8922-adcdc899b632" containerID="ccc3b2c6e99cb63369120234f78e03c40f7502629397be2489760d94a1bdc974" exitCode=0
Mar 13 10:41:32.582039 master-0 kubenswrapper[17876]: I0313 10:41:32.581997 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-xvxcr_f8c7f667-d30e-41f4-8c0e-f3f138bffab4/cluster-olm-operator/0.log"
Mar 13 10:41:32.582930 master-0 kubenswrapper[17876]: I0313 10:41:32.582868 17876 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660" exitCode=255
Mar 13 10:41:32.582974 master-0 kubenswrapper[17876]: I0313 10:41:32.582913 17876 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="136407fc6ee546951641a1123b4e37b22c08b30eef90bafae91497fd8eca613e" exitCode=0
Mar 13 10:41:32.582974 master-0 kubenswrapper[17876]: I0313 10:41:32.582958 17876 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="57e72688ac44b6f412bc80bc5d4c7d9672ed6ce81db27dd8e0ee399b42f61ca3" exitCode=0
Mar 13 10:41:32.584733 master-0 kubenswrapper[17876]: I0313 10:41:32.584699 17876 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe" exitCode=1
Mar 13 10:41:32.592922 master-0 kubenswrapper[17876]: E0313 10:41:32.592873 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:32.593078 master-0 kubenswrapper[17876]: I0313 10:41:32.592940 17876 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" exitCode=0
Mar 13 10:41:32.596520 master-0 kubenswrapper[17876]: I0313 10:41:32.596425 17876 generic.go:334] "Generic (PLEG): container finished" podID="257ae542-4a06-42d3-b3e8-bf0a376494a8" containerID="799c00d706ab085bdece95573540241444c883e9ee37d48b06d60922afea2895" exitCode=0
Mar 13 10:41:32.596520 master-0 kubenswrapper[17876]: I0313 10:41:32.596461 17876 generic.go:334] "Generic (PLEG): container finished" podID="257ae542-4a06-42d3-b3e8-bf0a376494a8" containerID="18b792e5b93f77cf52a60082b53bff347c1fb4352f7afe19baba67d3e0c88848" exitCode=0
Mar 13 10:41:32.600969 master-0 kubenswrapper[17876]: I0313 10:41:32.600926 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/0.log"
Mar 13 10:41:32.601076 master-0 kubenswrapper[17876]: I0313 10:41:32.600984 17876 generic.go:334] "Generic (PLEG): container finished" podID="0881de70-2db3-4fc2-b976-b55c11dc239d" containerID="d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829" exitCode=1
Mar 13 10:41:32.602798 master-0 kubenswrapper[17876]: I0313 10:41:32.602765 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager-operator_kube-controller-manager-operator-86d7cdfdfb-z7h4j_53da2840-4a92-497a-a9d3-973583887147/kube-controller-manager-operator/1.log"
Mar 13 10:41:32.602855 master-0 kubenswrapper[17876]: I0313 10:41:32.602807 17876 generic.go:334] "Generic (PLEG): container finished" podID="53da2840-4a92-497a-a9d3-973583887147" containerID="023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493" exitCode=255
Mar 13 10:41:32.605220 master-0 kubenswrapper[17876]: I0313 10:41:32.605179 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-b2ss8_cf740515-d70d-44b6-ac00-21143b5494d1/ingress-operator/0.log"
Mar 13 10:41:32.605286 master-0 kubenswrapper[17876]: I0313 10:41:32.605225 17876 generic.go:334] "Generic (PLEG): container finished" podID="cf740515-d70d-44b6-ac00-21143b5494d1" containerID="1619a1ce8609d442d9975720a8d6d707786b968509ed048f691e33fc7d117748" exitCode=1
Mar 13 10:41:32.611058 master-0 kubenswrapper[17876]: I0313 10:41:32.610981 17876 generic.go:334] "Generic (PLEG): container finished" podID="2157cb66-d458-4353-bc9c-ef761e61e5c5" containerID="6b4220f271a2b153bc0e77946705d348742d71cd7644e3f17d99cbdeff70f16f" exitCode=0
Mar 13 10:41:32.611058 master-0 kubenswrapper[17876]: I0313 10:41:32.611042 17876 generic.go:334] "Generic (PLEG): container finished" podID="2157cb66-d458-4353-bc9c-ef761e61e5c5" containerID="9af6352032b6a53c8275f34292597e82151238d6d1e06b053ba0617d04ed63ea" exitCode=0
Mar 13 10:41:32.613540 master-0 kubenswrapper[17876]: I0313 10:41:32.613500 17876 generic.go:334] "Generic (PLEG): container finished" podID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerID="5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3" exitCode=0
Mar 13 10:41:32.613540 master-0 kubenswrapper[17876]: I0313 10:41:32.613535 17876 generic.go:334] "Generic (PLEG): container finished" podID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerID="c5e876296b0a2729a3344c97bacebf2dce95059710f134fefa8e83abca942e51" exitCode=0
Mar 13 10:41:32.638840 master-0 kubenswrapper[17876]: I0313 10:41:32.638788 17876 generic.go:334] "Generic (PLEG): container finished" podID="0932314b-ccf5-4be5-99f8-b99886392daa" containerID="b633052bfd920e96b180e39e901d4b8b219bb35a62da570c5f41752fe4e617fe" exitCode=0
Mar 13 10:41:32.646975 master-0 kubenswrapper[17876]: I0313 10:41:32.646923 17876 generic.go:334] "Generic (PLEG): container finished" podID="f99b999c-4213-4d29-ab14-26c584e88445" containerID="58d7404b838e4c314c4bb71f4fca18a37f75d33d03431ce85b9c2b50d05d498a" exitCode=0
Mar 13 10:41:32.646975 master-0 kubenswrapper[17876]: I0313 10:41:32.646964 17876 generic.go:334] "Generic (PLEG): container finished" podID="f99b999c-4213-4d29-ab14-26c584e88445" containerID="b5d9c7e0055ba7e94e605d53781c97326170e75e394826099511e568c7ceef53" exitCode=0
Mar 13 10:41:32.653635 master-0 kubenswrapper[17876]: I0313 10:41:32.653587 17876 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="ce2b6ceda0b8c8212b1e35589d611accb6e40391c87b39cfb64f98a22b7e5dda" exitCode=0
Mar 13 10:41:32.653635 master-0 kubenswrapper[17876]: I0313 10:41:32.653633 17876 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="c08b2c581358381ac2f0c793ddf6295e272c0061c1b2d6e05d6e5ab7c2a5729b" exitCode=0
Mar 13 10:41:32.653750 master-0 kubenswrapper[17876]: I0313 10:41:32.653643 17876 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="ffc23a177a087ad146cddc2bc253947b08886f41c707f8ee47efc6dd4d3c5c8e" exitCode=0
Mar 13 10:41:32.653750 master-0 kubenswrapper[17876]: I0313 10:41:32.653651 17876 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="dc84ce423f666bcd523a540ff225040b69d4425d2faf8d523c79672591bd3375" exitCode=0
Mar 13 10:41:32.653750 master-0 kubenswrapper[17876]: I0313 10:41:32.653657 17876 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="c0bf4ee121253f4acc846c62a0fe4a189d6104b07034617c1152a5f95507935c" exitCode=0
Mar 13 10:41:32.653750 master-0 kubenswrapper[17876]: I0313 10:41:32.653664 17876 generic.go:334] "Generic (PLEG): container finished" podID="cc66541c-6410-4824-b173-53747069429e" containerID="8b657fe74504b246eb725ae59f9af4bc83c980e78da29e84184ef677c02cddbf" exitCode=0
Mar 13 10:41:32.657181 master-0 kubenswrapper[17876]: I0313 10:41:32.657143 17876 generic.go:334] "Generic (PLEG): container finished" podID="6e69683c-59c5-43da-b105-ef2efb2d0a4e" containerID="9c421d2fac6d7087c86a68ae07bf424407e762fa4149a323b0ac68e925b5c3b2" exitCode=0
Mar 13 10:41:32.659746 master-0 kubenswrapper[17876]: I0313 10:41:32.659697 17876 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="9e202824b084c4177db3bd9002d881090f9c8da16dc67819aecdad944afe647d" exitCode=0
Mar 13 10:41:32.659746 master-0 kubenswrapper[17876]: I0313 10:41:32.659736 17876 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="038536df2c456779ce7e0291a2536f4028dbe7eacec6c366598f83e56cd809ba" exitCode=0
Mar 13 10:41:32.659746 master-0 kubenswrapper[17876]: I0313 10:41:32.659747 17876 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="ee4e3ab4663e1587ce994fc6b4abf7c85bf2b949922e7c558f6898fa4c2d1ce1" exitCode=0
Mar 13 10:41:32.662598 master-0 kubenswrapper[17876]: E0313 10:41:32.662562 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:32.670150 master-0 kubenswrapper[17876]: I0313 10:41:32.670101 17876 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="8f137541b8024be9dec3a0e2a3bb479dfd8210f470244154f734979cdb98e7ff" exitCode=0
Mar 13 10:41:32.680969 master-0 kubenswrapper[17876]: I0313 10:41:32.680931 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_packageserver-85b658d7fb-45fq6_97328e01-1227-417e-9af7-6426495d96db/packageserver/0.log"
Mar 13 10:41:32.681161 master-0 kubenswrapper[17876]: I0313 10:41:32.681002 17876 generic.go:334] "Generic (PLEG): container finished" podID="97328e01-1227-417e-9af7-6426495d96db" containerID="c39379a7ceff230ca12a3c25b2f95b4de4ef093f144e78b137c6626ee9d2fcfb" exitCode=2
Mar 13 10:41:32.685551 master-0 kubenswrapper[17876]: I0313 10:41:32.684427 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-d5flg_06ecac2e-bffa-474b-a824-9ba4a194159a/control-plane-machine-set-operator/0.log"
Mar 13 10:41:32.685551 master-0 kubenswrapper[17876]: I0313 10:41:32.684564 17876 generic.go:334] "Generic (PLEG): container finished" podID="06ecac2e-bffa-474b-a824-9ba4a194159a" containerID="77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490" exitCode=1
Mar 13 10:41:32.686319 master-0 kubenswrapper[17876]: I0313 10:41:32.686274 17876 generic.go:334] "Generic (PLEG): container finished" podID="d04e4749-2b79-49e2-a451-a2733443a913" containerID="6bd307155c0397e849a532ef6dcebc4cbbbf850ed4d002b219c4c046ec36c6b8" exitCode=1
Mar 13 10:41:32.687936 master-0 kubenswrapper[17876]: I0313 10:41:32.687906 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-hkjrg_a3c91eef-ec46-419f-b418-ac3a8094b77d/approver/0.log"
Mar 13 10:41:32.688427 master-0 kubenswrapper[17876]: I0313 10:41:32.688262 17876 generic.go:334] "Generic (PLEG): container finished" podID="a3c91eef-ec46-419f-b418-ac3a8094b77d" containerID="d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248" exitCode=1
Mar 13 10:41:32.692253 master-0 kubenswrapper[17876]: I0313 10:41:32.692221 17876 generic.go:334] "Generic (PLEG): container finished" podID="2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf" containerID="e4267b4b9b6b191ff966b31bd837f533d3228034c0ef80179d1995e5cb7ea50e" exitCode=0
Mar 13 10:41:32.694366 master-0 kubenswrapper[17876]: I0313 10:41:32.694332 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-wz9t2_b57f1c19-f44a-4405-8135-79aef1d1ce07/cluster-storage-operator/0.log"
Mar 13 10:41:32.694366 master-0 kubenswrapper[17876]: I0313 10:41:32.694365 17876 generic.go:334] "Generic (PLEG): container finished" podID="b57f1c19-f44a-4405-8135-79aef1d1ce07" containerID="6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea" exitCode=255
Mar 13 10:41:32.701791 master-0 kubenswrapper[17876]: I0313 10:41:32.701738 17876 generic.go:334] "Generic (PLEG): container finished" podID="193b3b95-f9a3-4272-853b-86366ce348a2" containerID="ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac" exitCode=0
Mar 13 10:41:32.705269 master-0 kubenswrapper[17876]: I0313 10:41:32.705243 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_c834b554-c652-4f45-9110-3d4e260ba98a/installer/0.log"
Mar 13 10:41:32.705411 master-0 kubenswrapper[17876]: I0313 10:41:32.705280 17876 generic.go:334] "Generic (PLEG): container finished" podID="c834b554-c652-4f45-9110-3d4e260ba98a" containerID="bda6d571a69475cffe984e819a7cc51ddb710348cfb7bd2636c19986e3e1d5ca" exitCode=1
Mar 13 10:41:32.712953 master-0 kubenswrapper[17876]: I0313 10:41:32.712920 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 13 10:41:32.713374 master-0 kubenswrapper[17876]: I0313 10:41:32.713332 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3" exitCode=1
Mar 13 10:41:32.713428 master-0 kubenswrapper[17876]: I0313 10:41:32.713375 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="e61aa885acf4e08508a6ce338221d6e4395ca3102a9b91ced2db728621c8a1d6" exitCode=0
Mar 13 10:41:32.729677 master-0 kubenswrapper[17876]: I0313 10:41:32.729622 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9" exitCode=0
Mar 13 10:41:32.729677 master-0 kubenswrapper[17876]: I0313 10:41:32.729655 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f" exitCode=0
Mar 13 10:41:32.729677 master-0 kubenswrapper[17876]: I0313 10:41:32.729663 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f" exitCode=0
Mar 13 10:41:32.744555 master-0 kubenswrapper[17876]: I0313 10:41:32.744499 17876 generic.go:334] "Generic (PLEG): container finished" podID="fb060653-0d4b-4759-a7a1-c5dce194cce7" containerID="f741ec84eccfaea3008e82066654cae2f174abb120ece50ffb0345c3a6b62422" exitCode=0
Mar 13 10:41:32.748878 master-0 kubenswrapper[17876]: I0313 10:41:32.748834 17876 generic.go:334] "Generic (PLEG): container finished" podID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerID="34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104" exitCode=0
Mar 13 10:41:32.751304 master-0 kubenswrapper[17876]: I0313 10:41:32.751271 17876 generic.go:334] "Generic (PLEG): container finished" podID="9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906" containerID="acffbeb48d69148ddc4c8917c5bd669fe4ed2976ba6b612592b2abc4fff01c7e" exitCode=0
Mar 13 10:41:32.755335 master-0 kubenswrapper[17876]: I0313 10:41:32.755250 17876 generic.go:334] "Generic (PLEG): container finished" podID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerID="00a4f5e044b3bb37309a0058cc340985271f0a9be303d372e70635d4947090aa" exitCode=0
Mar 13 10:41:32.756867 master-0 kubenswrapper[17876]: I0313 10:41:32.756841 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-z9wrg_8d2fdba3-9478-4165-9207-d01483625607/network-operator/1.log"
Mar 13 10:41:32.756980 master-0 kubenswrapper[17876]: I0313 10:41:32.756885 17876 generic.go:334] "Generic (PLEG): container finished" podID="8d2fdba3-9478-4165-9207-d01483625607" containerID="1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258" exitCode=255
Mar 13 10:41:32.758714 master-0 kubenswrapper[17876]: I0313 10:41:32.758680 17876 generic.go:334] "Generic (PLEG): container finished" podID="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" containerID="2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba" exitCode=0
Mar 13 10:41:32.760682 master-0 kubenswrapper[17876]: I0313 10:41:32.760655 17876 generic.go:334] "Generic (PLEG): container finished" podID="8b07c5ae-1149-4031-bd92-6df4331e586c" containerID="fc95bff32f2114b905d9fbe18892b7b039189a377e939c5fcb424714913dd15f" exitCode=0
Mar 13 10:41:32.760682 master-0 kubenswrapper[17876]: I0313 10:41:32.760673 17876 generic.go:334] "Generic (PLEG): container finished" podID="8b07c5ae-1149-4031-bd92-6df4331e586c" containerID="053d1c527d639c6703a290ef72056a864dded275336f60631cf170ecafc6976b" exitCode=0
Mar 13 10:41:32.762674 master-0 kubenswrapper[17876]: I0313 10:41:32.762586 17876 generic.go:334] "Generic (PLEG): container finished" podID="893dac15-d6d4-4a1f-988c-59aaf9e63334" containerID="32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0" exitCode=0
Mar 13 10:41:32.762674 master-0 kubenswrapper[17876]: E0313 10:41:32.762644 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:32.768920 master-0 kubenswrapper[17876]: I0313 10:41:32.768887 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_6b488263-6a56-439c-945e-926936ed049d/installer/0.log"
Mar 13 10:41:32.769046 master-0 kubenswrapper[17876]: I0313 10:41:32.768926 17876 generic.go:334] "Generic (PLEG): container finished" podID="6b488263-6a56-439c-945e-926936ed049d" containerID="cfc30e3ed734f4cb74033d3d0ab50e918052fd74c62e5f4931d21fcdfbcbd074" exitCode=1
Mar 13 10:41:32.792992 master-0 kubenswrapper[17876]: E0313 10:41:32.792949 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:32.863396 master-0 kubenswrapper[17876]: E0313 10:41:32.863343 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:32.963577 master-0 kubenswrapper[17876]: E0313 10:41:32.963456 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.063813 master-0 kubenswrapper[17876]: E0313 10:41:33.063727 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.167157 master-0 kubenswrapper[17876]: E0313 10:41:33.166263 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.205475 master-0 kubenswrapper[17876]: E0313 10:41:33.193211 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:33.267021 master-0 kubenswrapper[17876]: E0313 10:41:33.266875 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.367779 master-0 kubenswrapper[17876]: E0313 10:41:33.367435 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.468248 master-0 kubenswrapper[17876]: E0313 10:41:33.468197 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.569317 master-0 kubenswrapper[17876]: E0313 10:41:33.569199 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.669818 master-0 kubenswrapper[17876]: E0313 10:41:33.669766 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.816903 master-0 kubenswrapper[17876]: E0313 10:41:33.815345 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.915692 master-0 kubenswrapper[17876]: E0313 10:41:33.915585 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:33.993620 master-0 kubenswrapper[17876]: E0313 10:41:33.993545 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:34.015978 master-0 kubenswrapper[17876]: E0313 10:41:34.015896 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.116784 master-0 kubenswrapper[17876]: E0313 10:41:34.116645 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.217610 master-0 kubenswrapper[17876]: E0313 10:41:34.217465 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.318472 master-0 kubenswrapper[17876]: E0313 10:41:34.318344 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.418795 master-0 kubenswrapper[17876]: E0313 10:41:34.418743 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.519013 master-0 kubenswrapper[17876]: E0313 10:41:34.518874 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.619650 master-0 kubenswrapper[17876]: E0313 10:41:34.619536 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.720261 master-0 kubenswrapper[17876]: E0313 10:41:34.720160 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.821222 master-0 kubenswrapper[17876]: E0313 10:41:34.820952 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:34.921896 master-0 kubenswrapper[17876]: E0313 10:41:34.921774 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.022809 master-0 kubenswrapper[17876]: E0313 10:41:35.022721 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.123448 master-0 kubenswrapper[17876]: E0313 10:41:35.123345 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.223944 master-0 kubenswrapper[17876]: E0313 10:41:35.223872 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.325049 master-0 kubenswrapper[17876]: E0313 10:41:35.324949 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.426091 master-0 kubenswrapper[17876]: E0313 10:41:35.425881 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.526294 master-0 kubenswrapper[17876]: E0313 10:41:35.526184 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.593859 master-0 kubenswrapper[17876]: E0313 10:41:35.593727 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:35.627379 master-0 kubenswrapper[17876]: E0313 10:41:35.627249 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.727846 master-0 kubenswrapper[17876]: E0313 10:41:35.727613 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.828680 master-0 kubenswrapper[17876]: E0313 10:41:35.828597 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:35.929792 master-0 kubenswrapper[17876]: E0313 10:41:35.929693 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.030146 master-0 kubenswrapper[17876]: E0313 10:41:36.029923 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.130708 master-0 kubenswrapper[17876]: E0313 10:41:36.130622 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.231455 master-0 kubenswrapper[17876]: E0313 10:41:36.231368 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.332059 master-0 kubenswrapper[17876]: E0313 10:41:36.331904 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.432491 master-0 kubenswrapper[17876]: E0313 10:41:36.432433 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.532692 master-0 kubenswrapper[17876]: E0313 10:41:36.532616 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.633089 master-0 kubenswrapper[17876]: E0313 10:41:36.633043 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.733797 master-0 kubenswrapper[17876]: E0313 10:41:36.733681 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.878503 master-0 kubenswrapper[17876]: E0313 10:41:36.834223 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:36.936062 master-0 kubenswrapper[17876]: E0313 10:41:36.935604 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:37.036058 master-0 kubenswrapper[17876]: E0313 10:41:37.035988 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:37.136495 master-0 kubenswrapper[17876]: E0313 10:41:37.136443 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:37.237437 master-0 kubenswrapper[17876]: E0313 10:41:37.237280 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:37.337680 master-0 kubenswrapper[17876]: E0313 10:41:37.337630 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:37.438509 master-0 kubenswrapper[17876]: E0313 10:41:37.438453 17876 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found"
Mar 13 10:41:37.456914 master-0 kubenswrapper[17876]: I0313 10:41:37.456855 17876 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 10:41:37.457573 master-0 kubenswrapper[17876]: I0313 10:41:37.457536 17876 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 10:41:37.466401 master-0 kubenswrapper[17876]: I0313 10:41:37.466357 17876 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 10:41:37.492250 master-0 kubenswrapper[17876]: I0313 10:41:37.492129 17876 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 13 10:41:37.495740 master-0 kubenswrapper[17876]: I0313 10:41:37.495698 17876 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 10:41:38.437878 master-0 kubenswrapper[17876]: I0313 10:41:38.437809 17876 apiserver.go:52] "Watching apiserver"
Mar 13 10:41:38.529142 master-0 kubenswrapper[17876]: I0313 10:41:38.528308 17876 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 10:41:38.794504 master-0 kubenswrapper[17876]: E0313 10:41:38.794366 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:43.795461 master-0 kubenswrapper[17876]: E0313 10:41:43.795367 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:48.796606 master-0 kubenswrapper[17876]: E0313 10:41:48.796489 17876 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 13 10:41:50.009645 master-0 kubenswrapper[17876]: I0313 10:41:50.009592 17876 manager.go:324] Recovery completed
Mar 13 10:41:50.216748 master-0 kubenswrapper[17876]: I0313 10:41:50.212871 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-6gzxr_5da919b6-8545-4001-89f3-74cb289327f0/multus-admission-controller/0.log"
Mar 13 10:41:50.216748 master-0 kubenswrapper[17876]: I0313 10:41:50.212959 17876 generic.go:334] "Generic (PLEG): container finished" podID="5da919b6-8545-4001-89f3-74cb289327f0" containerID="2276fd8efc0fde40f37ca319cd91132fc15d5529319ce35ac0901720d64c7ce3" exitCode=137
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224186 17876 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224218 17876 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224280 17876 state_mem.go:36] "Initialized new in-memory state store"
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224586 17876 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224602 17876 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224643 17876 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224652 17876 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 13 10:41:50.225899 master-0 kubenswrapper[17876]: I0313 10:41:50.224677 17876 policy_none.go:49] "None policy: Start"
Mar 13 10:41:50.233751 master-0 kubenswrapper[17876]: I0313 10:41:50.233704 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7z94w_277614e8-838f-4773-bcfc-89f19c620dee/kube-multus-additional-cni-plugins/0.log"
Mar 13 10:41:50.234003 master-0 kubenswrapper[17876]: I0313 10:41:50.233766 17876 generic.go:334] "Generic (PLEG): container finished" podID="277614e8-838f-4773-bcfc-89f19c620dee" containerID="e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb" exitCode=137
Mar 13 10:41:50.241981 master-0 kubenswrapper[17876]: I0313 10:41:50.241017 17876 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 13 10:41:50.241981 master-0 kubenswrapper[17876]: I0313 10:41:50.241117 17876 state_mem.go:35] "Initializing new in-memory state store"
Mar 13 10:41:50.241981 master-0 kubenswrapper[17876]: I0313 10:41:50.241415 17876 state_mem.go:75] "Updated machine memory state"
Mar 13 10:41:50.241981 master-0 kubenswrapper[17876]: I0313 10:41:50.241428 17876 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 13 10:41:50.264526 master-0 kubenswrapper[17876]: I0313 10:41:50.264493 17876 manager.go:334] "Starting Device Plugin manager"
Mar 13 10:41:50.264718 master-0 kubenswrapper[17876]: I0313 10:41:50.264561 17876 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 13 10:41:50.264718 master-0 kubenswrapper[17876]: I0313 10:41:50.264583 17876 server.go:79] "Starting device plugin registration server"
Mar 13 10:41:50.265153 master-0 kubenswrapper[17876]: I0313 10:41:50.265137 17876 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 13 10:41:50.265217 master-0 kubenswrapper[17876]: I0313 10:41:50.265169 17876 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 13 10:41:50.269509 master-0 kubenswrapper[17876]: I0313 10:41:50.265452 17876 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 13 10:41:50.269509 master-0 kubenswrapper[17876]: I0313 10:41:50.265546 17876 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 13 10:41:50.269509 master-0 kubenswrapper[17876]: I0313 10:41:50.265553 17876 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 13 10:41:50.371647 master-0 kubenswrapper[17876]: I0313 10:41:50.368224 17876 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 13 10:41:50.378158 master-0 kubenswrapper[17876]: I0313 10:41:50.377397 17876 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 13 10:41:50.378158 master-0 kubenswrapper[17876]: I0313 10:41:50.377435 17876 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 13 10:41:50.378158 master-0 kubenswrapper[17876]: I0313 10:41:50.377444 17876 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 13 10:41:50.378158 master-0 kubenswrapper[17876]: I0313 10:41:50.377550 17876 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 13 10:41:50.574723 master-0 kubenswrapper[17876]: I0313 10:41:50.573668 17876 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 13 10:41:50.574723 master-0 kubenswrapper[17876]: I0313 10:41:50.573776 17876 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 13 10:41:53.797499 master-0 kubenswrapper[17876]: I0313 10:41:53.797361 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","kube-system/bootstrap-kube-controller-manager-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0"]
Mar 13 10:41:53.798146 master-0 kubenswrapper[17876]: I0313 10:41:53.797993 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2","openshift-ingress-operator/ingress-operator-677db989d6-b2ss8","openshift-marketplace/redhat-marketplace-dnhzw","openshift-oauth-apiserver/apiserver-999d99f5f-hlk52","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z","openshift-etcd/installer-1-master-0","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh","openshift-network-diagnostics/network-check-target-jwfjl","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j","openshift-dns/node-resolver-d542b","openshift-machine-config-operator/machine-config-server-zkmjs","openshift-network-operator/network-operator-7c649bf6d4-z9wrg","openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl","openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c","openshift-dns-operator/dns-operator-589895fbb7-6zkqh","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9","openshift-kube-scheduler/installer-4-master-0","openshift-apiserver/apiserver-576d4447f8-zqphk","openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt","openshift-machine-config-operator/machine-config-daemon-j9twr","openshift-multus/cni-sysctl-allowlist-ds-7z94w","openshift-service-ca/service-ca-84bfdbbb7f-xldln","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j","openshift-kube-scheduler/installer-4-retry-1-master-0","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5","openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt","openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg","openshift-dns/dns-default-qt95m","openshift-ingress/router-default-79f8cd6fdd-mbkch","openshift-insights/insights-operator-8f89dfddd-v9x5b","openshift-marketplace/marketplace-operator-64bf9778cb-4v99n","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl","openshift-marketplace/community-operators-lhqzl","openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc","openshift-multus/network-metrics-daemon-c5vhc","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7","openshift-kube-apiserver/kube-apiserver-master-0","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4","openshift-marketplace/redhat-operators-kqrsd","openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6","openshift-multus/multus-additional-cni-plugins-72t2n","openshift-multus/multus-bjv5r","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm","openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l","openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k","openshift-controller-manager/controller-manager-79847c4f97-tf57f","openshift-machine-api/machine-api-operator
-84bf6db4f9-svqcp","openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv","openshift-network-operator/iptables-alerter-55t7x","openshift-config-operator/openshift-config-operator-64488f9d78-pchtd","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-kube-scheduler/openshift-kube-scheduler-master-0","openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp","openshift-multus/multus-admission-controller-8d675b596-6gzxr","openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6","openshift-cluster-node-tuning-operator/tuned-mzx9f","openshift-kube-apiserver/installer-2-retry-1-master-0","openshift-kube-controller-manager/installer-2-master-0","openshift-marketplace/certified-operators-kwwkz","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk","openshift-kube-apiserver/installer-2-master-0","openshift-multus/multus-admission-controller-7769569c45-6lqz5","openshift-ovn-kubernetes/ovnkube-node-vww4t","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq","openshift-etcd/etcd-master-0","openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz","openshift-network-node-identity/network-node-identity-hkjrg","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7","assisted-installer/assisted-installer-controller-k96f8","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"] Mar 13 10:41:53.799891 master-0 kubenswrapper[17876]: I0313 10:41:53.799845 17876 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="fc60183e-eaac-41ef-a9ef-6ba30d1fb673" Mar 13 10:41:53.803050 master-0 kubenswrapper[17876]: I0313 10:41:53.801192 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-k96f8" Mar 13 10:41:53.809159 master-0 kubenswrapper[17876]: I0313 10:41:53.808694 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 13 10:41:53.809159 master-0 kubenswrapper[17876]: I0313 10:41:53.808744 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 13 10:41:53.809459 master-0 kubenswrapper[17876]: I0313 10:41:53.809212 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 13 10:41:53.814518 master-0 kubenswrapper[17876]: I0313 10:41:53.809706 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 13 10:41:53.825618 master-0 kubenswrapper[17876]: I0313 10:41:53.824376 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:53.825618 master-0 kubenswrapper[17876]: I0313 10:41:53.824569 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 13 10:41:53.827530 master-0 kubenswrapper[17876]: I0313 10:41:53.827485 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.827753 master-0 kubenswrapper[17876]: I0313 10:41:53.827674 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 10:41:53.827753 master-0 kubenswrapper[17876]: I0313 10:41:53.827793 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 10:41:53.827904 master-0 kubenswrapper[17876]: I0313 10:41:53.827896 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 10:41:53.828566 master-0 kubenswrapper[17876]: I0313 10:41:53.828527 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.829438 master-0 kubenswrapper[17876]: I0313 10:41:53.829395 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.832062 master-0 kubenswrapper[17876]: I0313 10:41:53.832019 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:53.832219 master-0 kubenswrapper[17876]: I0313 10:41:53.832011 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"acbb43bf2cf27ed60d1f635fd6638ac7","Type":"ContainerStarted","Data":"32b612a83dd7d1068aeb085ed38090442b0aaa55e436fc815870541f19159b65"} Mar 13 10:41:53.832219 master-0 kubenswrapper[17876]: I0313 10:41:53.832221 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerStarted","Data":"5147fab22d0f195a0d02e6752ac479b6a2eb3fafa582ebefd6e564393cc0c1fe"} Mar 13 10:41:53.832219 master-0 kubenswrapper[17876]: I0313 10:41:53.832237 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-5db65d9766-lg686"] Mar 13 10:41:53.832882 master-0 kubenswrapper[17876]: E0313 10:41:53.832841 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerName="installer" Mar 13 10:41:53.832882 master-0 kubenswrapper[17876]: I0313 10:41:53.832875 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.832898 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.832906 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.832925 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d04e4749-2b79-49e2-a451-a2733443a913" 
containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.832937 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="d04e4749-2b79-49e2-a451-a2733443a913" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.832951 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.832961 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.832978 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.833014 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.833034 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.833043 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.833056 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.833063 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.833080 17876 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.833088 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.833179 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b488263-6a56-439c-945e-926936ed049d" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.833189 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b488263-6a56-439c-945e-926936ed049d" containerName="installer" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.832945 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: E0313 10:41:53.833207 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:53.833238 master-0 kubenswrapper[17876]: I0313 10:41:53.833255 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: E0313 10:41:53.833269 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="994d29a3-98d8-45bd-8922-adcdc899b632" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833279 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="994d29a3-98d8-45bd-8922-adcdc899b632" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833139 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833472 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833535 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="d04e4749-2b79-49e2-a451-a2733443a913" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833511 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833617 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833623 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833762 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="a7c07c6e-447f-4111-9d5a-b848fc3e1b2b" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833806 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833870 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833810 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833941 17876 
memory_manager.go:354] "RemoveStaleState removing state" podUID="c834b554-c652-4f45-9110-3d4e260ba98a" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833961 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833978 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="994d29a3-98d8-45bd-8922-adcdc899b632" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.833995 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="2982c23c-b1dc-4090-9de1-a5c555ac6dad" containerName="assisted-installer-controller" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.834016 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.834030 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b488263-6a56-439c-945e-926936ed049d" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.834045 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="1e86a3b0-37b3-4df1-a522-f29cda076753" containerName="installer" Mar 13 10:41:53.834250 master-0 kubenswrapper[17876]: I0313 10:41:53.834220 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834331 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834472 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834507 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerStarted","Data":"572ea33856cce4705673f267dd6cdd4075e17161bfb7a4a9a4a7bdfe53ae4cca"} Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834544 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834603 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834616 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834546 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db65d9766-lg686"] Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834735 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerDied","Data":"74ae020ca7669fb01b80f8f98f454493cc6cfee0df109ea9dc9a0bb83ef979da"} Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834809 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" event={"ID":"018c9219-d314-4408-ac39-93475d87eefb","Type":"ContainerStarted","Data":"6d81df6e0c2c501a006e6d355e7ca64b7f375686077a624175b4786dbf2e5138"} Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834824 17876 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-k96f8" event={"ID":"2982c23c-b1dc-4090-9de1-a5c555ac6dad","Type":"ContainerDied","Data":"39e3998474ffa5421ada785b69659b745abc434915dc0302700b2f60923ba978"} Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834880 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-k96f8" event={"ID":"2982c23c-b1dc-4090-9de1-a5c555ac6dad","Type":"ContainerDied","Data":"30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308"} Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834897 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30db7d5fe4993d804fc45a8ea268bf157254ba86d7efadf92a22d4e6eda05308" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834909 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerDied","Data":"a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4"} Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834966 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-rb7nv"] Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.834929 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.835236 master-0 kubenswrapper[17876]: I0313 10:41:53.835067 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835123 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835374 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835402 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835502 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835516 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835619 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835637 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835725 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 10:41:53.835873 master-0 kubenswrapper[17876]: I0313 10:41:53.835808 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 13 10:41:53.836269 master-0 kubenswrapper[17876]: I0313 10:41:53.836031 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 10:41:53.836269 master-0 kubenswrapper[17876]: I0313 10:41:53.836204 17876 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 10:41:53.836341 master-0 kubenswrapper[17876]: I0313 10:41:53.836325 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 10:41:53.836452 master-0 kubenswrapper[17876]: I0313 10:41:53.836430 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 10:41:53.836497 master-0 kubenswrapper[17876]: I0313 10:41:53.836479 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:53.836557 master-0 kubenswrapper[17876]: I0313 10:41:53.836535 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 10:41:53.836660 master-0 kubenswrapper[17876]: I0313 10:41:53.836637 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 10:41:53.836761 master-0 kubenswrapper[17876]: I0313 10:41:53.836739 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.836879 master-0 kubenswrapper[17876]: I0313 10:41:53.836854 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 13 10:41:53.836990 master-0 kubenswrapper[17876]: I0313 10:41:53.836963 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 13 10:41:53.837966 master-0 kubenswrapper[17876]: I0313 10:41:53.837088 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 10:41:53.838023 master-0 
kubenswrapper[17876]: I0313 10:41:53.836324 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerStarted","Data":"2276fd8efc0fde40f37ca319cd91132fc15d5529319ce35ac0901720d64c7ce3"} Mar 13 10:41:53.838081 master-0 kubenswrapper[17876]: I0313 10:41:53.838039 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-rb7nv"] Mar 13 10:41:53.838081 master-0 kubenswrapper[17876]: I0313 10:41:53.838059 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerStarted","Data":"9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190"} Mar 13 10:41:53.838081 master-0 kubenswrapper[17876]: I0313 10:41:53.838079 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-p5ncj"] Mar 13 10:41:53.838220 master-0 kubenswrapper[17876]: E0313 10:41:53.837715 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:53.838220 master-0 kubenswrapper[17876]: E0313 10:41:53.837819 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:53.838299 master-0 kubenswrapper[17876]: I0313 10:41:53.837286 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 10:41:53.838369 master-0 kubenswrapper[17876]: I0313 10:41:53.838282 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 10:41:53.838369 master-0 
kubenswrapper[17876]: E0313 10:41:53.837239 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 13 10:41:53.838447 master-0 kubenswrapper[17876]: I0313 10:41:53.837327 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 13 10:41:53.838447 master-0 kubenswrapper[17876]: I0313 10:41:53.837357 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 13 10:41:53.838531 master-0 kubenswrapper[17876]: I0313 10:41:53.837376 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 10:41:53.838531 master-0 kubenswrapper[17876]: E0313 10:41:53.837729 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"openshift-kube-scheduler-master-0\" already exists" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:53.838621 master-0 kubenswrapper[17876]: I0313 10:41:53.837381 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 13 10:41:53.838621 master-0 kubenswrapper[17876]: I0313 10:41:53.837414 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 13 10:41:53.838723 master-0 kubenswrapper[17876]: I0313 10:41:53.837420 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 13 10:41:53.838723 master-0 kubenswrapper[17876]: I0313 10:41:53.837466 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 10:41:53.838794 master-0 kubenswrapper[17876]: I0313 10:41:53.837490 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 10:41:53.838829 master-0 kubenswrapper[17876]: I0313 10:41:53.837509 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.838829 master-0 kubenswrapper[17876]: I0313 10:41:53.837536 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 10:41:53.838902 master-0 kubenswrapper[17876]: I0313 10:41:53.838858 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p5ncj"]
Mar 13 10:41:53.838902 master-0 kubenswrapper[17876]: I0313 10:41:53.838881 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-trztz"]
Mar 13 10:41:53.838993 master-0 kubenswrapper[17876]: I0313 10:41:53.837563 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.838993 master-0 kubenswrapper[17876]: I0313 10:41:53.838946 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 10:41:53.839078 master-0 kubenswrapper[17876]: I0313 10:41:53.837570 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 10:41:53.839078 master-0 kubenswrapper[17876]: I0313 10:41:53.839036 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.839195 master-0 kubenswrapper[17876]: I0313 10:41:53.839142 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 10:41:53.839239 master-0 kubenswrapper[17876]: I0313 10:41:53.837574 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 10:41:53.839287 master-0 kubenswrapper[17876]: E0313 10:41:53.837746 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:41:53.839328 master-0 kubenswrapper[17876]: I0313 10:41:53.837594 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 10:41:53.839374 master-0 kubenswrapper[17876]: I0313 10:41:53.839327 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 13 10:41:53.839374 master-0 kubenswrapper[17876]: I0313 10:41:53.839352 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.839374 master-0 kubenswrapper[17876]: I0313 10:41:53.837608 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 10:41:53.839494 master-0 kubenswrapper[17876]: I0313 10:41:53.837626 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 10:41:53.839494 master-0 kubenswrapper[17876]: I0313 10:41:53.839481 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.839529 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.837631 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.837651 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.837653 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.837674 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.837715 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.837736 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.839573 master-0 kubenswrapper[17876]: I0313 10:41:53.839356 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.837751 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.839772 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.837767 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.837785 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.837816 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.837864 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.837881 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.838087 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.839569 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.839846 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.839871 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.839899 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 10:41:53.840045 master-0 kubenswrapper[17876]: I0313 10:41:53.839971 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 10:41:53.840522 master-0 kubenswrapper[17876]: I0313 10:41:53.839806 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.840522 master-0 kubenswrapper[17876]: I0313 10:41:53.840271 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 13 10:41:53.840522 master-0 kubenswrapper[17876]: I0313 10:41:53.840390 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 10:41:53.840522 master-0 kubenswrapper[17876]: I0313 10:41:53.840418 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 13 10:41:53.840522 master-0 kubenswrapper[17876]: I0313 10:41:53.840501 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5ncj"
Mar 13 10:41:53.840823 master-0 kubenswrapper[17876]: I0313 10:41:53.840785 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 13 10:41:53.841209 master-0 kubenswrapper[17876]: I0313 10:41:53.841122 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 10:41:53.841512 master-0 kubenswrapper[17876]: I0313 10:41:53.841460 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6558455fc8-8qww9"]
Mar 13 10:41:53.842057 master-0 kubenswrapper[17876]: I0313 10:41:53.842016 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" event={"ID":"25332da9-099c-4190-9e24-c19c86830a54","Type":"ContainerStarted","Data":"997999accb5a6bff6c2c6f0ce4bfa996a8b256c62954c05e165b3f90b0b8f80d"}
Mar 13 10:41:53.842057 master-0 kubenswrapper[17876]: I0313 10:41:53.842051 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6558455fc8-8qww9"]
Mar 13 10:41:53.842185 master-0 kubenswrapper[17876]: I0313 10:41:53.842127 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-trztz"
Mar 13 10:41:53.842242 master-0 kubenswrapper[17876]: I0313 10:41:53.842191 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9"
Mar 13 10:41:53.842242 master-0 kubenswrapper[17876]: I0313 10:41:53.842189 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" event={"ID":"25332da9-099c-4190-9e24-c19c86830a54","Type":"ContainerStarted","Data":"136e725a814882d97a92b91f392b5a4bb1498352a85819c564006fc0555c46b2"}
Mar 13 10:41:53.842242 master-0 kubenswrapper[17876]: I0313 10:41:53.842218 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" event={"ID":"2c3e94d4-5c6d-4092-975c-e5bca49eb397","Type":"ContainerStarted","Data":"6e7919b9ec2a19d38d0cdba955ac4202dc210129fdef5e7c637e62cb54c916e6"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842244 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" event={"ID":"2c3e94d4-5c6d-4092-975c-e5bca49eb397","Type":"ContainerStarted","Data":"b4158eeef011b1eba9a7b6d623266b582de3676d037792b146138f13d693513f"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842258 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" event={"ID":"ba3e43ba-2840-4612-a370-87ad3c5a382a","Type":"ContainerStarted","Data":"d028fc794a246b2460076d0dced5db6f65d2c7474177aae275ffc67970fe251d"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842271 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" event={"ID":"ba3e43ba-2840-4612-a370-87ad3c5a382a","Type":"ContainerStarted","Data":"7502f9cc62ba09fc484231576dec29370231e1a4a0ab25671b22dd093e569524"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842283 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"3b44838d-cfe0-42fe-9927-d0b5391eee81","Type":"ContainerDied","Data":"4f57dbde7e6dd83a3f45d28b694622a3cd36e451a3d2e531b974cdf91eee3a45"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842298 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" event={"ID":"3b44838d-cfe0-42fe-9927-d0b5391eee81","Type":"ContainerDied","Data":"d4e74163544c10bf31d045c60068db268de2c869878f5f7b983afe24046cf63d"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842310 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4e74163544c10bf31d045c60068db268de2c869878f5f7b983afe24046cf63d"
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842321 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" event={"ID":"024d9bd3-ac77-4257-9808-7518f2a73e11","Type":"ContainerStarted","Data":"b9231178429930d79290e4d816cda8b0b95b77b22b615d27922f30211e7570b4"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842332 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" event={"ID":"024d9bd3-ac77-4257-9808-7518f2a73e11","Type":"ContainerStarted","Data":"390d92c6b1bf8de4d4ea48cb675d878d3b2cbd2b0311fc47e5e4feef80f55449"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842342 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" event={"ID":"e7d31378-e940-4473-ab37-10f250c76666","Type":"ContainerStarted","Data":"f8c529cacd73744ab46d9439b7345a1d27edc1e2d71b7933b404f4206bf30909"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842353 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" event={"ID":"e7d31378-e940-4473-ab37-10f250c76666","Type":"ContainerStarted","Data":"e0d3f2c007226936b12b661f6223b55c53b4c84c882223c0c75ca57b895fa28c"}
Mar 13 10:41:53.842362 master-0 kubenswrapper[17876]: I0313 10:41:53.842363 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" event={"ID":"e7d31378-e940-4473-ab37-10f250c76666","Type":"ContainerStarted","Data":"e34fa9d84124b6c127298dbbcc66ee1981c2d493a18d9fee5da615255d116cb0"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842375 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"b6607de7f8444878291cce041e89b284e3fdfa07de1c40770b98ee1612cc8d65"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842387 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"4e35d9e42c3db125a61c2fa53787bbec93b1a84b0ca9bbb457199baa790d8533"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842400 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerDied","Data":"eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842412 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"646d9925ac7d679e5fe105dacc2e5ba2bf65b630c171bd0e095c89f902ecba0a"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842434 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" event={"ID":"a9258b0f-fdcc-4bfa-b982-5cf3c899c432","Type":"ContainerStarted","Data":"1c2d1da70477f9212bbcb8a5fb61059a816621fa16583f02d7521a05fdef2147"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842447 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" event={"ID":"a9258b0f-fdcc-4bfa-b982-5cf3c899c432","Type":"ContainerDied","Data":"4bdeab3ebfebb7845458ea9c29cbf7443ef96922911395dc3575274a6c5d9316"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842461 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" event={"ID":"a9258b0f-fdcc-4bfa-b982-5cf3c899c432","Type":"ContainerStarted","Data":"4a9a41f76fe188e7c2fc303922714d8a4a4540bbc426c47477e0dbcbe14a461c"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842472 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"1a464f39fb3c28eac1b441005b20c015c22b43034a9004d421103f0a297535d2"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842484 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerDied","Data":"65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842495 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"3471f8b061f69364d6a6c8cab5125567cc698ffc7bd409e71de797b3e4919d0c"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842506 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842517 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842529 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842539 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842550 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842560 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerDied","Data":"b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842573 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"4c3280e9367536f782caf8bdc07edb85","Type":"ContainerStarted","Data":"a08d83e357b2d0b2bdb74340c14200dc0261576386fc93ad944fe72db723fbff"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842585 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerStarted","Data":"7f44cac9d59c9752582d0c710ae74baa24a3adcc9cd398ea6e5fd9c8a59527e5"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842598 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerDied","Data":"a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842611 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerStarted","Data":"2e4a3a4a7895f019e0118f1584bc95eca1f9c60af18c9d3fe595f768be766c6d"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842626 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"1e86a3b0-37b3-4df1-a522-f29cda076753","Type":"ContainerDied","Data":"d19b978c1e8101a0212df3b6611d9d31aa1e8b34d80df670a9b5c7dd94abdbf2"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842640 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"1e86a3b0-37b3-4df1-a522-f29cda076753","Type":"ContainerDied","Data":"cc178eff65e9e37dfca64d7638a02200669b20cdded82a2b29fd98ec8a15cc9e"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842652 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cc178eff65e9e37dfca64d7638a02200669b20cdded82a2b29fd98ec8a15cc9e"
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842662 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"c14288f5668e235056cc67c66c8553579053cff3b8159a0ec2c339bf75712609"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842673 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerDied","Data":"84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842686 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"640ae6e09ed226b337075233b9303b1fb0d56099898746f5ff9f07d686060f2d"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842699 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"2fe7b69e87a4fa6425da976dffbe87c8c66862e1127867967d8f83ef262d49b7"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842711 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"994d29a3-98d8-45bd-8922-adcdc899b632","Type":"ContainerDied","Data":"ccc3b2c6e99cb63369120234f78e03c40f7502629397be2489760d94a1bdc974"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842725 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"994d29a3-98d8-45bd-8922-adcdc899b632","Type":"ContainerDied","Data":"86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842737 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86e4c748dd805648a1520aba2bdf6a7b723dc2383a9f6375ee6ba4a4d8543cc8"
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842785 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" event={"ID":"1f358d81-87c6-40bf-89e8-5681429285f8","Type":"ContainerStarted","Data":"b5048988f4d14da58f4ecce60f1b0f53c921c94b9f30bb0d6da211a5c6a3196b"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842798 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" event={"ID":"1f358d81-87c6-40bf-89e8-5681429285f8","Type":"ContainerStarted","Data":"a4d11bdc39191c7e80e10de4111c03e816618edb2f6936bc80974dc84533f018"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842820 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"3063852d78f813c61c60f480671955bc61c573d347b6da50459bfe7f96b2e4ca"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842838 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"d11003d934637dd1f9b6e8d23feaca1fc18325edb8c1c59e1375d0720a4469cd"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842854 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"19fc005175f8b2f478ce604ebba0699b1705bf9617eadf91f124ce9c5926d18d"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842867 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjv5r" event={"ID":"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc","Type":"ContainerStarted","Data":"d2a94cae7314d31af0d86ce94f25ddeb94ed3dfcd2a8de1530f6be8d77df9d59"}
Mar 13 10:41:53.842835 master-0 kubenswrapper[17876]: I0313 10:41:53.842880 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bjv5r" event={"ID":"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc","Type":"ContainerStarted","Data":"26fff2dc3e41e48ba0dc7d9f2053140bd93b347f3136b6ae79fe14dd5feaaf19"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842894 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerStarted","Data":"8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842913 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerStarted","Data":"2374456736ebc7d72463b6654e06d916657c29a267fba9a956c950f521d8de03"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842925 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerStarted","Data":"77ae6dbbf39c4d2991c10b142e9d6fe23b3ada856897b7bc34aa3b7d69fa418b"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842937 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" event={"ID":"a3a72b45-a705-4335-9c04-c952ec5d9975","Type":"ContainerStarted","Data":"7b8b432491b64c35241699cea9dca0847beab01faa6b11ea1ee81f0edac7188e"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842951 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" event={"ID":"a3a72b45-a705-4335-9c04-c952ec5d9975","Type":"ContainerStarted","Data":"bb70bbe39b0a248a6aa4cef7e86697f7d917e3ba95ec678efc7f04cb53a9a7e7"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842965 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" event={"ID":"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e","Type":"ContainerStarted","Data":"d8db377380cc25a98f74177a2d972c0aadff0f1684a6e93080f90cae3a912f32"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842978 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" event={"ID":"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e","Type":"ContainerStarted","Data":"e669989fc04a3bbaaf8170906f7d49c5660764cd591eb569492010fe67858c9f"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.842989 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" event={"ID":"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e","Type":"ContainerStarted","Data":"d3d43a9e0d6fcadcc6f108a3c9946899c22aed0cea6199f09212e71a1b6ab24d"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843002 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerStarted","Data":"2011f2a930c1149a0110b2744b7cf0ecd80491982b05c3fd36024d0672252582"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843019 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843032 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"136407fc6ee546951641a1123b4e37b22c08b30eef90bafae91497fd8eca613e"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843045 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"57e72688ac44b6f412bc80bc5d4c7d9672ed6ce81db27dd8e0ee399b42f61ca3"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843057 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerStarted","Data":"7d64d717a487ab97526e634cae4313689073c2b2e0011a91b55f956bc40bfde9"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843070 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"cbd147b01b260c41122b60c0c59b0fada043d48bb6658bed62fc58e0949c3b69"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843083 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843115 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"d4c4f345608352771d181c87ae83f87748ecbf6ccdee52cebdd330e421648437"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843129 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843142 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" event={"ID":"a13f3e08-2b67-404f-8695-77aa17f92137","Type":"ContainerStarted","Data":"804af0f197810492da9674aa46937e1801ae5a14e02f73596e29a002fe9774f2"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843156 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" event={"ID":"a13f3e08-2b67-404f-8695-77aa17f92137","Type":"ContainerStarted","Data":"851fb998ee7d34cb6bb04d5f4061e13a565db5d18010b2516dd1dd436a846840"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843167 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" event={"ID":"a13f3e08-2b67-404f-8695-77aa17f92137","Type":"ContainerStarted","Data":"26320b73ca3fce1850dde3e75da5ccc58878b72f0f352ff1a9c176723a2b7d3d"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843188 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" event={"ID":"0529b217-a9ef-48fb-b40a-b6789c640c20","Type":"ContainerStarted","Data":"1001e4a8a4042183edf1d1d087bc112421eacf94e38f3e35de4c5170d3dca5be"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843203 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" event={"ID":"0529b217-a9ef-48fb-b40a-b6789c640c20","Type":"ContainerStarted","Data":"fd5397c516d4a5473893c96f67f300df25fb73d79280ce5bc95242d87f0224a1"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843217 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-j9twr" event={"ID":"0529b217-a9ef-48fb-b40a-b6789c640c20","Type":"ContainerStarted","Data":"9a9692d62aeb99fb7d4d3fc80637ffdf1ea3947790e26d640f42aacc16302c11"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843227 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" event={"ID":"277614e8-838f-4773-bcfc-89f19c620dee","Type":"ContainerStarted","Data":"e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843237 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" event={"ID":"277614e8-838f-4773-bcfc-89f19c620dee","Type":"ContainerStarted","Data":"2a3ae0ef1861ea401e0b8a9b1d8fd796b2315f2b16e1b237d258aa72508e4e53"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843247 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" event={"ID":"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e","Type":"ContainerStarted","Data":"cf504ad2f3ecd51940abd8bf5bd673489b537e2883f503eb785901acbd1d1d46"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843257 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" event={"ID":"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e","Type":"ContainerStarted","Data":"af7a768842b9cbb587f10537824efb3089e2d3b4f70fb674c1d644bca3af49d7"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843267 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843277 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843288 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843298 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerDied","Data":"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843311 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"f25091eee8852eb2edb273c98fe0cda0a03827d71939b56576ffbabe005dcf83"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843322 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerStarted","Data":"f2f35061d66ce08b758ee386196ed6ff6b4759bf3ce064d800ee6dab38937e10"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843333 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerDied","Data":"799c00d706ab085bdece95573540241444c883e9ee37d48b06d60922afea2895"}
Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843351 17876
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerDied","Data":"18b792e5b93f77cf52a60082b53bff347c1fb4352f7afe19baba67d3e0c88848"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843362 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-kwwkz" event={"ID":"257ae542-4a06-42d3-b3e8-bf0a376494a8","Type":"ContainerStarted","Data":"8dcc826566dd71c1ba57235e348946dc0ebda9dd34a3e4858af9e5eff577f76f"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843372 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zkmjs" event={"ID":"161beda5-f575-4e60-8baa-5262a4fe86c7","Type":"ContainerStarted","Data":"5f85cde1e59c38c70f96b8d80a5986cf96d25b188ad7c135912463c3cc69c6c8"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843382 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-zkmjs" event={"ID":"161beda5-f575-4e60-8baa-5262a4fe86c7","Type":"ContainerStarted","Data":"1d85f90b35c0a6fe94e4911c5e6e2a9798938c9acd1504a9008825c00646ea44"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843393 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b399c8bc734d16f4c258d0605a39203e9489484fa48d09e79fa8aa138647119c" Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843402 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"df8dbee9c77b0ca318382f012c8a23d7d342a4f43e0448369274b4a7e9be8d82"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843411 17876 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"4db43ea419f7842a2dbe6e4e76fd533d04eb0ced70cb2513c77273e29bfa971d"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843421 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerDied","Data":"d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843432 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"51b866160e4a9eb352c0562a3f222378da0e7fac05a4589c8c137feb5a82511b"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843442 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerStarted","Data":"ffbff762b6947c8a6cf71150184bbac8a221faecb2335c23291939c8a280ac89"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843451 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerDied","Data":"023d875133fdc4ce04cc7bc5bc0a4a73438cc8932d3e9561b68f3dbe9285c493"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843462 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" 
event={"ID":"53da2840-4a92-497a-a9d3-973583887147","Type":"ContainerStarted","Data":"aba1a9619c2284c0ac03b64f0ae7435f08f471030b575fc29fa6e377cf560350"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843473 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"443c931f2ebac98a3b89766ad47f2b9a07d8226240bc2a88a99655cd8cc10093"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843482 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"4a344875d4670ed9716f0cef98985188762c8daf81f4743d50027d07c28af916"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843491 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerDied","Data":"1619a1ce8609d442d9975720a8d6d707786b968509ed048f691e33fc7d117748"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843503 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" event={"ID":"cf740515-d70d-44b6-ac00-21143b5494d1","Type":"ContainerStarted","Data":"bcfacb71ae88d504692e95ad77d6c9b51c2d2697daec2bf687474302cc5abf90"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843512 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerStarted","Data":"eaaaf7cee366d112a249fdfea1e9161302183d264a0f34f87ad0c3717abfbc0b"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843522 17876 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerDied","Data":"6b4220f271a2b153bc0e77946705d348742d71cd7644e3f17d99cbdeff70f16f"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843533 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerDied","Data":"9af6352032b6a53c8275f34292597e82151238d6d1e06b053ba0617d04ed63ea"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843543 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-kqrsd" event={"ID":"2157cb66-d458-4353-bc9c-ef761e61e5c5","Type":"ContainerStarted","Data":"e95e82ba3152944d5f266f4315ecef6f288f0249fcf6dd92d242f6cd35eb008a"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843553 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"80e219d86f62937cb95412f8c97959374104036b5299a214ae589c72f2965a63"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843564 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerDied","Data":"5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843574 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerDied","Data":"c5e876296b0a2729a3344c97bacebf2dce95059710f134fefa8e83abca942e51"} Mar 13 10:41:53.844379 master-0 
kubenswrapper[17876]: I0313 10:41:53.843584 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"09bada5ccab47e885c246b1faeb8678a7b3ac7c3284ff798a95c9eec287bbd00"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843593 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"8629ec87935b9c8163acca5e90c43ffc35598371cd514995496e1b481f1cd153"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843603 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"820c0c015259910a43a9d65233b6b59d3ff531e30b8ae70477184cc755d8b5d2"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843613 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"9fb60bfa59d2ff40288f456815269ff4c838e82195edd334933c8654b4f8dedd"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843622 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerStarted","Data":"6ba9e9be1786a23e8c36df67db33e0578535dc45660f08e0b4a15c0971863075"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843631 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" 
event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerDied","Data":"b633052bfd920e96b180e39e901d4b8b219bb35a62da570c5f41752fe4e617fe"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843643 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" event={"ID":"0932314b-ccf5-4be5-99f8-b99886392daa","Type":"ContainerStarted","Data":"362b488b60e500edad345a3bdb391d8633a2602bd4a4c722e98aafcb67a03251"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843654 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" event={"ID":"b5ed7aff-47c0-42f3-9a26-9385d2bde582","Type":"ContainerStarted","Data":"01cb1eb4cd1847633cd84937df690f5742decd6c5d3c9b634653c1fb9ee3bc43"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843668 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" event={"ID":"b5ed7aff-47c0-42f3-9a26-9385d2bde582","Type":"ContainerStarted","Data":"3c79fc8c488cef73422f2806765beba462d671a998503570dc8a76fca3916919"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843678 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerStarted","Data":"37af611a75465718656693a5e1606817c7f3876bc4578fbedfae2376aafb266a"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843688 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerDied","Data":"58d7404b838e4c314c4bb71f4fca18a37f75d33d03431ce85b9c2b50d05d498a"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843698 17876 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerDied","Data":"b5d9c7e0055ba7e94e605d53781c97326170e75e394826099511e568c7ceef53"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843708 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-dnhzw" event={"ID":"f99b999c-4213-4d29-ab14-26c584e88445","Type":"ContainerStarted","Data":"2e7b5b751a85830176443ad561d2805b7b5b4c1ac49971eb3ef970b7e37cecd2"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843717 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerStarted","Data":"7cfa3dc4e8621eea443d54a6af5854af2a55bace9bce4224ea7b93f4c1da9807"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843727 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"ce2b6ceda0b8c8212b1e35589d611accb6e40391c87b39cfb64f98a22b7e5dda"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843738 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"c08b2c581358381ac2f0c793ddf6295e272c0061c1b2d6e05d6e5ab7c2a5729b"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843747 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"ffc23a177a087ad146cddc2bc253947b08886f41c707f8ee47efc6dd4d3c5c8e"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843757 17876 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"dc84ce423f666bcd523a540ff225040b69d4425d2faf8d523c79672591bd3375"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843767 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"c0bf4ee121253f4acc846c62a0fe4a189d6104b07034617c1152a5f95507935c"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843776 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerDied","Data":"8b657fe74504b246eb725ae59f9af4bc83c980e78da29e84184ef677c02cddbf"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843785 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-72t2n" event={"ID":"cc66541c-6410-4824-b173-53747069429e","Type":"ContainerStarted","Data":"ecfb809f461ed4b5e17c0262b316e339ce9305b6bc6bd651c9825d3462c45829"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843796 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" event={"ID":"17b956d3-c046-4f26-8be2-718c165a3acc","Type":"ContainerStarted","Data":"3fd3883c8b186f065fdd7d04082a866d1d3335a481da8ad2d7fb2179391a51ba"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843805 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" event={"ID":"17b956d3-c046-4f26-8be2-718c165a3acc","Type":"ContainerStarted","Data":"19f35bad4079f0b545148fd4db4666ab80db062f38092a6802b80cab4ec7982a"} Mar 13 
10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843855 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerStarted","Data":"fe42327b95dec5367f541c81b048f39545c2d05c4325d9527175937bbfdf24b4"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843868 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerDied","Data":"9c421d2fac6d7087c86a68ae07bf424407e762fa4149a323b0ac68e925b5c3b2"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843899 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" event={"ID":"6e69683c-59c5-43da-b105-ef2efb2d0a4e","Type":"ContainerStarted","Data":"0731faf1ccc38c5ab120a7bbc1107b95b55d96e38e45782c9e5c1a73b27a4aa2"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843915 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e87db7bbfe1b12ff9c4d6e51a7557b0b5b9f888224f2994eb06b3c08acb3aee0" Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843924 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" event={"ID":"7748068f-7409-4972-81d2-84cfb52b7af0","Type":"ContainerStarted","Data":"70ef4c5f1d692f58502f8e513680c34b7093d5497ebb044ab29ea9dcc18a1719"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843933 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" event={"ID":"7748068f-7409-4972-81d2-84cfb52b7af0","Type":"ContainerStarted","Data":"c1df94cdd30ef8f9eee6f877acb1f8a1552be4430e802f50d22a9879330a2fc9"} Mar 13 
10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843941 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" event={"ID":"7748068f-7409-4972-81d2-84cfb52b7af0","Type":"ContainerStarted","Data":"907b8fe5b1745c9ca01da828cf8707b8c3a68c4c3ef14b623c7c1e97c76cec2a"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843952 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" event={"ID":"2563ecb2-5783-4c45-a7f6-180e14e1c8c4","Type":"ContainerStarted","Data":"7ca6a95d1c17626751cc95fa8484019b6c82c228421de958d7300514a2ca3f13"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843961 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" event={"ID":"2563ecb2-5783-4c45-a7f6-180e14e1c8c4","Type":"ContainerStarted","Data":"de680a22776cd5fe71b4b6d498091c7d353a1cf41ab4b46ddcaa37a48ad3bc06"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843969 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" event={"ID":"2563ecb2-5783-4c45-a7f6-180e14e1c8c4","Type":"ContainerStarted","Data":"b6406db9242e3599a9f6b43c6cc7f931a2398c12649757d5a331d9757d32028e"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843979 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jwfjl" event={"ID":"a7b698d2-f23a-4404-bc63-757ca549356f","Type":"ContainerStarted","Data":"dbde788ea183ad05575d070f12031405131e50e2eb12fce79b8429c063439949"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843989 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-jwfjl" 
event={"ID":"a7b698d2-f23a-4404-bc63-757ca549356f","Type":"ContainerStarted","Data":"eeb72465bb1427cd72d3fec6562ba06ea7643d9bdc5ec1fb0376fb8a56a95ac9"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.843998 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"ad75c939343bfb30bc5319b14b8035776ee4b1b3343e77f1374907643eae75c7"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844007 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"3671ca168b59df1b45e12ef956adf5651789dcef52410877c19c3c2f33c47060"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844017 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"3f70a6e48f4961d3f3aa9bd2ea9a0d93f3b6d1cb80845a1b38a9f457c4a26858"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844028 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="06fa16b6f429d4eead3bc6c77c9dd34958237b3dcfdcf9e1ccdd2d0cbc03965f" Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844036 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerStarted","Data":"f8ee70c9fa0ff679b2fe8d381e882bf591af1eea80403c4097b9987e2d06b36d"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844045 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" 
event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerDied","Data":"c39379a7ceff230ca12a3c25b2f95b4de4ef093f144e78b137c6626ee9d2fcfb"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844056 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" event={"ID":"97328e01-1227-417e-9af7-6426495d96db","Type":"ContainerStarted","Data":"4f70e184622d577e74124d1d17bc445ea80514437cbc221bcb9f2c6f012aa2ca"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844068 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerStarted","Data":"406d6e11697cacd57dcd99d84785c736a52ac48c6ef5c27b81e728ae6e2f38f1"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844078 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerDied","Data":"77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844087 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerStarted","Data":"b090a7b841b2284b4a367b1fe9eb531751b92400aca909b51b87e9d7691a206c"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844111 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f1b14a227c8bc8b981f29cfb4546b0b823b1f503f3e6f7c9e6a036205e1e83ce" Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844120 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"4306aa93623283fa1e756de36acf9fe639a1c8b92b5741ac2b1dc315689b3cc6"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844128 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerDied","Data":"d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844141 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"8ab405ef0e7b542476b55860f034ef7404421d6b9bac08317c0aa8791073c002"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844149 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"45e97c8be6a0792cbf5d1476a7f96b024d4d2f79219317d9d80b590652a61ee5"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844158 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerStarted","Data":"215427705b781af8c7a6f0bf3e652f4e47b2031bd0151f282db01d4307872ee6"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844167 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerDied","Data":"e4267b4b9b6b191ff966b31bd837f533d3228034c0ef80179d1995e5cb7ea50e"} Mar 13 
10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844178 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" event={"ID":"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf","Type":"ContainerStarted","Data":"3570848357e5506974fe0bf7403febd141c42df26480ee23abd1ee4bc5538372"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844187 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerStarted","Data":"8d2502ddf45dc60246cfc038c25340d355c40feb7ef15264d33e1c93664efbd3"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844196 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerDied","Data":"6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844205 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerStarted","Data":"5759216ebfee850b79609783445de8124c370c8bac5b63e2b5f03e38c742e1f0"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844222 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" event={"ID":"258f571e-5ec8-42df-b4ba-17457d87d10d","Type":"ContainerStarted","Data":"ad7266ae43d7f039a194144e05ebf1043632f55b79736d2da49f78da98fb730b"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844232 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" event={"ID":"258f571e-5ec8-42df-b4ba-17457d87d10d","Type":"ContainerStarted","Data":"c49cb5ec4e7e39a0508963b675cac957ba726b0560cc0f79f6aa2da35216dcaa"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844241 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" event={"ID":"21bb85e2-0d4a-418f-a7c9-482e8eafce19","Type":"ContainerStarted","Data":"c81f55f61228604f6223600595f4c2e8e2f4dceb06b2bdad97b4839cb1807b1b"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844250 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" event={"ID":"21bb85e2-0d4a-418f-a7c9-482e8eafce19","Type":"ContainerStarted","Data":"7623887564e1fd29b1c01e5d18c6715a43b71a693407bef1bea029e2735f11dd"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844259 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" event={"ID":"ecb5bdcc-647d-4292-a33d-dc3df331c206","Type":"ContainerStarted","Data":"cc39dd97fa33d7186bc0c795b8d5e196c978cac3bdc2c8d9dbf7380009448266"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844268 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" event={"ID":"ecb5bdcc-647d-4292-a33d-dc3df331c206","Type":"ContainerStarted","Data":"e91ae8a44c4b4ac29324f7dfadcc336d6d0480a0d6149be7ceb4f9d9b967f1b2"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844278 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" 
event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"b87c048ad8f6b66600aef035430a3c74694d425a7990645314c96636905e37f6"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844287 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerDied","Data":"ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844298 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"b824573f6b95b2e21d36b9d4c1faa0ee0ac02b8c48ac4481752faf216bc6b459"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844306 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"7653351a80744624f96bb693379607a8ee7ec36896c7128ff03ffe2db44fbdb0"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844315 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" event={"ID":"b02805e2-f186-4e59-bdfa-f4793263b468","Type":"ContainerStarted","Data":"706a8d0e60c2f5ca912ef3877380449fef368655b29cf505668eb09b4233133e"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844324 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" event={"ID":"b02805e2-f186-4e59-bdfa-f4793263b468","Type":"ContainerStarted","Data":"82c2160bbc4014a38023fe88cc8ab1055a69a4c32765b0ad1ae3def9ef497d37"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: 
I0313 10:41:53.844332 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" event={"ID":"b02805e2-f186-4e59-bdfa-f4793263b468","Type":"ContainerStarted","Data":"8988806dc69dce5b61c53cc2845447a33f520244d709f93fdb6f76499aee8916"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844342 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c834b554-c652-4f45-9110-3d4e260ba98a","Type":"ContainerDied","Data":"bda6d571a69475cffe984e819a7cc51ddb710348cfb7bd2636c19986e3e1d5ca"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844352 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c834b554-c652-4f45-9110-3d4e260ba98a","Type":"ContainerDied","Data":"14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844359 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="14e4a5b96082336e956c460f6dbbd6950d248ffe902ecbce373e7f4ab4b93495" Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844367 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" event={"ID":"d9fd7b06-d61d-47c3-a08f-846245c79cc9","Type":"ContainerStarted","Data":"d551846b834f3c792666af696b3893004dc6412c55eeea4b5cdb805b2eaffa1b"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844376 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" event={"ID":"d9fd7b06-d61d-47c3-a08f-846245c79cc9","Type":"ContainerStarted","Data":"c918fb3b270e41c6d62b6e571b5882afaab66a46ce66ce229de4e70f9853f259"} Mar 13 10:41:53.844379 master-0 
kubenswrapper[17876]: I0313 10:41:53.844385 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" event={"ID":"03b97fde-467c-46f0-95f9-9c3820b4d790","Type":"ContainerStarted","Data":"466ebfbc9d939fb59cc09aed7d0174adb466a23bb438e923666b0bfead02089f"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844396 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" event={"ID":"03b97fde-467c-46f0-95f9-9c3820b4d790","Type":"ContainerStarted","Data":"64cac6ba3a561adbc8f8770dc2f28e49933388f06613c25151f7bbd0ceb39107"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844406 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"8d5872d3df5ae3d0356feb1227762765a592eb87fd4344b9e636b3a3e963fad0"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844417 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"2e52564d1775b46a6445744719c4c3157c46ddb2f615bf82c3d17e00c27324c3"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844427 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"e61aa885acf4e08508a6ce338221d6e4395ca3102a9b91ced2db728621c8a1d6"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844436 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a0d14a0b43734a571ff869b2d64db9d6e51ff5a9e4e7f399600737454cb213f4"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844446 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"12d7699651508d757bad08ce9a02fbaf1b9a7210ca40bc453b12412cce05999a"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844456 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"b449d051473ff9974acc080b10607f0bdeb8e4b0dbbbfc4c1bde4f8d09a30cfb"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844465 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"6b457fca38abf31ca20d44610b680f150e7060cd35d43f544ed341cc62e726d2"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844475 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"fe76c4da023ee8241529e5f2a6a092dc48a1a51d30db462a00bc458437ba96ee"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844485 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" 
event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"f9193ce0cecc29a04837d4cc5243527b46397232b9255d51f28db25efcba2a5f"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844498 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"48bcb8e3556650cfae3adfa0ba5f6b7611552bc7f1f0e3120408fbfc9691ca6f"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844508 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"0319f8f80eb171f822ffe8f69fbb1f9a58cf580e706e87e65dd082195bf305e5"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844517 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c5vhc" event={"ID":"8df2728b-4f21-4aef-b31f-4197bbcd2728","Type":"ContainerStarted","Data":"d7182bbcf3b04cf73af9cfa3d474e42048ccc1adcf54c50ae6cfbada1d1719cb"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844529 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c5vhc" event={"ID":"8df2728b-4f21-4aef-b31f-4197bbcd2728","Type":"ContainerStarted","Data":"17b0b898af4d319cde841f52d81826f891aba80f9ced795a7749caba01d53d9e"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844541 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c5vhc" event={"ID":"8df2728b-4f21-4aef-b31f-4197bbcd2728","Type":"ContainerStarted","Data":"77b4f8a8bc891942c93fc6bc58a70209e4d2685ce12294e206b71662186490b9"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844556 17876 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" event={"ID":"b460735c-56aa-4dd3-a756-759859083e12","Type":"ContainerStarted","Data":"4605ba5397add60b5787249493f28afddf74b60e5e4cff2c37fbf2c850052e1f"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844569 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" event={"ID":"b460735c-56aa-4dd3-a756-759859083e12","Type":"ContainerStarted","Data":"0914c9dfe834a278f9e1d4681bb723905c3e5989f516b46f2ba1193d83eed513"} Mar 13 10:41:53.844379 master-0 kubenswrapper[17876]: I0313 10:41:53.844581 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844593 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844605 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844616 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844625 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844636 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844647 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844656 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844664 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"6c3bc64f22f8c58f9e978db84c7754f9ee2b132931d3190f29d081554cf105af"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844673 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"b650c74f8b57c73e892b63268846a8c6d8dd851805ffc652eb497ec8ad4cfef2"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844683 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" 
event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"0f5ebe341252592ddb7ea07f27157b630ec6d6698481394ac477334f68310522"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844692 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"63231614ec22d4454ef35c7c5f658a2bf8feeb9e3992ebcd24e450ba7c030a73"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844702 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"3e57f0c206254ea3660894f2b43e0962df459b11648d6e3f38e8d9b4b235affb"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844711 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"faac5e329585d430afa85413196ec70d876c8e306f516444729861dbf4543445"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844720 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"fe506e16bf970e0974dba16aaea6afa314d9c57d8900b0995c3107b8a4cb3261"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844729 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"06939486a208e579a527e3fd963447fe09a8c883f53d1384dddfe29aa63e21b3"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844737 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"eb29cfca46e834bca84bd552c0afc75c789697afff5f022856c3a37252a97f97"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844747 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerDied","Data":"f741ec84eccfaea3008e82066654cae2f174abb120ece50ffb0345c3a6b62422"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844758 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" event={"ID":"fb060653-0d4b-4759-a7a1-c5dce194cce7","Type":"ContainerStarted","Data":"0e036cb949ad53abaeffbee83069e6acb0577ebaebdc915671dcc6e625c1d2d1"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844767 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerStarted","Data":"54da7b3235abaa116243b07a8cea7e97784d45d4d84871349e58b575ce64f621"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844776 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerDied","Data":"34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844786 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerStarted","Data":"5f4e5674ade432e52f9563a1f07684d2d9624c5df1e6b8e0fa3c971d3c078df8"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 
10:41:53.844795 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerStarted","Data":"6bccc03f527d31faff90a6a48a17616821689e95166564e5cd0c7e71c9851946"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844804 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerDied","Data":"acffbeb48d69148ddc4c8917c5bd669fe4ed2976ba6b612592b2abc4fff01c7e"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844815 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" event={"ID":"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906","Type":"ContainerStarted","Data":"e3e74e8a6d87769b2b8f6bdae5a948fbb44f464be31e39d10a8d9e290f6b63c1"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844825 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-55t7x" event={"ID":"58685de6-b4ae-4229-870b-5143a6010450","Type":"ContainerStarted","Data":"31df9233b4a5d4d57a39c81be8f4431504aae76b625128b5139003e68085c9bf"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844833 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-55t7x" event={"ID":"58685de6-b4ae-4229-870b-5143a6010450","Type":"ContainerStarted","Data":"bb3ca46b59b0129ad5727483a11511be5f137b040615767af5315ad6197275c5"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844842 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b","Type":"ContainerDied","Data":"00a4f5e044b3bb37309a0058cc340985271f0a9be303d372e70635d4947090aa"} Mar 13 
10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844854 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"a7c07c6e-447f-4111-9d5a-b848fc3e1b2b","Type":"ContainerDied","Data":"1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844862 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b631c8937934ac5b9ab90895b5a85362140ac33954ca78bfb346da5d4eb1406" Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844871 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"f92b7dcf30e2a83f947525493e88745aa9417da1536fbf60b66ed4a133a0e4a5"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844881 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerDied","Data":"1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844891 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"a0402f1c5a13e15611c8f63c3d9aee464f9ad7b4027e6b733af8eb3a802f622a"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844900 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerStarted","Data":"a37231e5cc55c1e76147d48ef0838775a990a3f28298bc163b9c8540136b0b87"} Mar 13 
10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844911 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerDied","Data":"2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844920 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerStarted","Data":"da062cae7ba30721cdab3fbeaf191a4effb6155035008cb1f6db9debdbeee327"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844929 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerStarted","Data":"292bc64fae325e305791874ac3c6df238e90679ca812b4e7ab3bdd42cad6e68f"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.844983 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerDied","Data":"fc95bff32f2114b905d9fbe18892b7b039189a377e939c5fcb424714913dd15f"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845017 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerDied","Data":"053d1c527d639c6703a290ef72056a864dded275336f60631cf170ecafc6976b"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845062 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-lhqzl" 
event={"ID":"8b07c5ae-1149-4031-bd92-6df4331e586c","Type":"ContainerStarted","Data":"edb84f3680f6b7a9122dea49c8ac75c4b3614e7e24eb119b118fbf82de0d5e2c"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845071 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerStarted","Data":"e541c073a97e968aa996efa485f9023f303d33477bd12a38bf45fb29e057d0dc"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845080 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerDied","Data":"32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845106 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerStarted","Data":"7d8988c40bcb4c1b05a397c81e2d096db0d22c32db0303c2deb1b424d97a407e"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845116 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" event={"ID":"fd91626c-38a8-462f-8bc0-96d57532de87","Type":"ContainerStarted","Data":"c7526d564e3a6f102aadf838e6bbb178d8da329a07b6933f64af4c716253d4e9"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845125 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" event={"ID":"fd91626c-38a8-462f-8bc0-96d57532de87","Type":"ContainerStarted","Data":"e67aecaee25884cdb16b8c22b14a8ace233f4db4719fca42142020ecb4c32d8c"} Mar 13 
10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845134 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" event={"ID":"fd91626c-38a8-462f-8bc0-96d57532de87","Type":"ContainerStarted","Data":"1ef234f61cea7c4557ed7630ebc1fc035e35f0ac9ec489d52978e9ee92ee0a9d"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845142 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d542b" event={"ID":"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c","Type":"ContainerStarted","Data":"9e9770f157a4ec6cd726bd326d6c98845c1b4b7c517bb15dcbd0850a4395d902"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845151 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-d542b" event={"ID":"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c","Type":"ContainerStarted","Data":"2659c5a6a41b8bd57f0bf3c1da691ca647e461b974a89f7c9f8fe2c464e9654a"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845160 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31634e1fa2a526a5eef76adce598a8e242bdd09cd3c5df9b79281ebf5788e31f" Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845169 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"6b488263-6a56-439c-945e-926936ed049d","Type":"ContainerDied","Data":"cfc30e3ed734f4cb74033d3d0ab50e918052fd74c62e5f4931d21fcdfbcbd074"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845179 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"6b488263-6a56-439c-945e-926936ed049d","Type":"ContainerDied","Data":"f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845187 17876 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f549932c0ebdea379e4d4be2975aff15ee1750a3cde3baee822ab2e357eb0f7a" Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845195 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" event={"ID":"e87ca16c-25de-4fea-b900-2960f4a5f95e","Type":"ContainerStarted","Data":"02539d7838ebb483ffcca293d983b439f593e30b5eaf03def36de01bbe1607e5"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845204 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" event={"ID":"e87ca16c-25de-4fea-b900-2960f4a5f95e","Type":"ContainerStarted","Data":"83984d61bee36a62e18f8d890427add9cd46f3fdf35427d35282826b077e6300"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845213 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerDied","Data":"2276fd8efc0fde40f37ca319cd91132fc15d5529319ce35ac0901720d64c7ce3"} Mar 13 10:41:53.850812 master-0 kubenswrapper[17876]: I0313 10:41:53.845224 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" event={"ID":"277614e8-838f-4773-bcfc-89f19c620dee","Type":"ContainerDied","Data":"e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb"} Mar 13 10:41:53.853771 master-0 kubenswrapper[17876]: I0313 10:41:53.853642 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 10:41:53.856717 master-0 kubenswrapper[17876]: I0313 10:41:53.856654 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 10:41:53.859515 
master-0 kubenswrapper[17876]: I0313 10:41:53.857945 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 10:41:53.865560 master-0 kubenswrapper[17876]: I0313 10:41:53.864815 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 10:41:53.866182 master-0 kubenswrapper[17876]: I0313 10:41:53.866115 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 13 10:41:53.866493 master-0 kubenswrapper[17876]: I0313 10:41:53.866457 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 10:41:53.866564 master-0 kubenswrapper[17876]: I0313 10:41:53.866470 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 10:41:53.879151 master-0 kubenswrapper[17876]: I0313 10:41:53.877636 17876 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 13 10:41:53.883662 master-0 kubenswrapper[17876]: I0313 10:41:53.883610 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 13 10:41:53.890150 master-0 kubenswrapper[17876]: I0313 10:41:53.890083 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7z94w_277614e8-838f-4773-bcfc-89f19c620dee/kube-multus-additional-cni-plugins/0.log" Mar 13 10:41:53.890367 master-0 kubenswrapper[17876]: I0313 10:41:53.890188 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:53.903079 master-0 kubenswrapper[17876]: I0313 10:41:53.903025 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 10:41:53.924418 master-0 kubenswrapper[17876]: I0313 10:41:53.924368 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930151 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930200 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930227 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-utilities\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930255 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5gkd\" (UniqueName: 
\"kubernetes.io/projected/a3a72b45-a705-4335-9c04-c952ec5d9975-kube-api-access-b5gkd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930279 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930301 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930341 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930357 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fswp7\" (UniqueName: \"kubernetes.io/projected/257ae542-4a06-42d3-b3e8-bf0a376494a8-kube-api-access-fswp7\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:41:53.932209 master-0 
kubenswrapper[17876]: I0313 10:41:53.930409 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930429 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930446 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vbsc\" (UniqueName: \"kubernetes.io/projected/61427254-6722-4d1a-a96a-dadd24abbe94-kube-api-access-6vbsc\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930466 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jk4qr\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-kube-api-access-jk4qr\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930483 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-mnnnp\" (UniqueName: \"kubernetes.io/projected/b57f1c19-f44a-4405-8135-79aef1d1ce07-kube-api-access-mnnnp\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930501 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmf6l\" (UniqueName: \"kubernetes.io/projected/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-kube-api-access-vmf6l\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930518 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nr6p\" (UniqueName: \"kubernetes.io/projected/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-kube-api-access-4nr6p\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930534 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930554 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-snapshots\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " 
pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930572 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930596 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930618 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-sys\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930639 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930665 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/258f571e-5ec8-42df-b4ba-17457d87d10d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-85vcp\" (UID: \"258f571e-5ec8-42df-b4ba-17457d87d10d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930686 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8z5fj\" (UniqueName: \"kubernetes.io/projected/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-kube-api-access-8z5fj\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930712 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930743 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn5nv\" (UniqueName: \"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930764 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod 
\"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930791 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930815 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930839 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930856 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930872 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930894 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930917 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930932 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-client\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930949 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " 
pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930964 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930980 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6lnq\" (UniqueName: \"kubernetes.io/projected/018c9219-d314-4408-ac39-93475d87eefb-kube-api-access-v6lnq\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.930995 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931010 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931027 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931042 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931076 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931117 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931140 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931164 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5ed7aff-47c0-42f3-9a26-9385d2bde582-service-ca\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931181 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-utilities\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931198 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-catalog-content\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931215 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931237 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p29b\" (UniqueName: \"kubernetes.io/projected/06ecac2e-bffa-474b-a824-9ba4a194159a-kube-api-access-6p29b\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931259 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931283 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931299 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931317 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931338 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931360 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931416 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931489 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931528 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod 
\"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931553 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931576 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931601 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931627 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:53.932209 master-0 
kubenswrapper[17876]: I0313 10:41:53.931643 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931660 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931683 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931699 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931721 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4kn26\" (UniqueName: 
\"kubernetes.io/projected/8b07c5ae-1149-4031-bd92-6df4331e586c-kube-api-access-4kn26\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931751 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-policies\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931771 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-serving-cert\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931788 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931807 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931831 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931851 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931868 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931887 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b57f1c19-f44a-4405-8135-79aef1d1ce07-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931910 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" 
(UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931933 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931957 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rqms\" (UniqueName: \"kubernetes.io/projected/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-kube-api-access-5rqms\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931974 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.931990 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932005 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-tuned\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932021 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-default-certificate\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932040 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932055 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932073 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932088 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932148 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932183 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932206 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932227 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932245 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-etcd-serving-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932261 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr5lp\" (UniqueName: \"kubernetes.io/projected/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-kube-api-access-qr5lp\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932278 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db9faadf-74e9-4a7f-b3a6-902dd14ac978-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932294 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:41:53.932209 master-0 kubenswrapper[17876]: I0313 10:41:53.932310 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932327 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gntlk\" (UniqueName: \"kubernetes.io/projected/2157cb66-d458-4353-bc9c-ef761e61e5c5-kube-api-access-gntlk\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932344 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932370 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932390 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8dpd\" (UniqueName: \"kubernetes.io/projected/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-kube-api-access-g8dpd\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932406 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932424 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932441 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932458 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932473 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-systemd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932489 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932505 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932520 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-hosts-file\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932537 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932554 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932570 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932587 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932604 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932621 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932638 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932656 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nqrh5\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-kube-api-access-nqrh5\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932673 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-audit\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932688 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysconfig\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932713 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932729 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932746 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5v4b\" (UniqueName: \"kubernetes.io/projected/84f78350-e85c-4377-97cd-9e9a1b2ff4ee-kube-api-access-d5v4b\") pod \"csi-snapshot-controller-7577d6f48-kcw4k\" (UID: \"84f78350-e85c-4377-97cd-9e9a1b2ff4ee\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932763 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932778 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932794 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-catalog-content\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932809 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932824 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932840 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932857 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932873 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-run\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932890 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-tmp\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932907 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932931 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ec33c506-8abe-4659-84d3-a294c31b446c-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932956 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932980 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvl4j\" (UniqueName: \"kubernetes.io/projected/b02805e2-f186-4e59-bdfa-f4793263b468-kube-api-access-cvl4j\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.932998 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933014 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fcqg\" (UniqueName: \"kubernetes.io/projected/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-kube-api-access-4fcqg\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933034 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933084 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933124 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933152 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933178 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933196 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933211 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933230 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933271 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933297 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933321 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9cxp\" (UniqueName: \"kubernetes.io/projected/b7090328-1191-4c7c-afed-603d7333014f-kube-api-access-v9cxp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933360 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933384 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933409 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933431 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933451 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933469 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933485 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933503 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933521 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933539 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933558 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933575 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933593 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933610 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933627 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933651 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933674 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5ed7aff-47c0-42f3-9a26-9385d2bde582-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933693 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933710 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-kubernetes\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933729 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933756 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933779 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933797 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933813 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4"
Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933834 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd"
Mar
13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933860 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-metrics-certs\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933884 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933900 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933917 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933935 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjkdx\" (UniqueName: \"kubernetes.io/projected/0881de70-2db3-4fc2-b976-b55c11dc239d-kube-api-access-vjkdx\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933952 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-stats-auth\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933968 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.933986 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934002 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934020 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls\") pod 
\"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934037 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934053 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934071 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934108 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934126 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-audit-dir\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934143 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-utilities\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934162 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-trusted-ca-bundle\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934188 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934211 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr9x5\" (UniqueName: \"kubernetes.io/projected/b460735c-56aa-4dd3-a756-759859083e12-kube-api-access-qr9x5\") pod \"network-check-source-7c67b67d47-zxjfv\" (UID: \"b460735c-56aa-4dd3-a756-759859083e12\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 
10:41:53.934233 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934264 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934291 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934315 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934334 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-serving-ca\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934352 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrmcp\" (UniqueName: \"kubernetes.io/projected/a0917212-59d8-4799-a9bc-52e358c5e8a0-kube-api-access-lrmcp\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934375 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-etcd-client\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934402 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934427 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mjm7\" (UniqueName: \"kubernetes.io/projected/fd91626c-38a8-462f-8bc0-96d57532de87-kube-api-access-7mjm7\") pod \"migrator-57ccdf9b5-k9n8l\" (UID: \"fd91626c-38a8-462f-8bc0-96d57532de87\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" Mar 13 
10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934447 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934472 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934500 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934526 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934546 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934572 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-dir\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934596 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934625 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934647 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: 
\"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934675 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934702 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934784 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-var-lib-kubelet\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934808 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d23e2957-3a22-44f6-937c-5ab6314681c0-host\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.934834 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.935109 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-utilities\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.935335 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-env-overrides\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.935592 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-snapshots\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.935615 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/17b956d3-c046-4f26-8be2-718c165a3acc-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:41:53.936165 master-0 
kubenswrapper[17876]: I0313 10:41:53.935775 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-config\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.935935 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/3f872e59-1de1-4a95-8064-79696c73e8ab-available-featuregates\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.936261 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8b07c5ae-1149-4031-bd92-6df4331e586c-catalog-content\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:41:53.936165 master-0 kubenswrapper[17876]: I0313 10:41:53.936284 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-utilities\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936404 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f872e59-1de1-4a95-8064-79696c73e8ab-serving-cert\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: \"3f872e59-1de1-4a95-8064-79696c73e8ab\") " 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936486 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-binary-copy\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936629 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/8df2728b-4f21-4aef-b31f-4197bbcd2728-metrics-certs\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936698 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/a13f3e08-2b67-404f-8695-77aa17f92137-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936704 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-tmp\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936768 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-utilities\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936808 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936835 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/ec33c506-8abe-4659-84d3-a294c31b446c-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936837 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936880 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws7gk\" (UniqueName: \"kubernetes.io/projected/7748068f-7409-4972-81d2-84cfb52b7af0-kube-api-access-ws7gk\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:53.940706 
master-0 kubenswrapper[17876]: I0313 10:41:53.936913 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936938 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-serving-cert\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936939 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/257ae542-4a06-42d3-b3e8-bf0a376494a8-catalog-content\") pod \"certified-operators-kwwkz\" (UID: \"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936961 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936976 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e7d31378-e940-4473-ab37-10f250c76666-metrics-tls\") pod \"dns-operator-589895fbb7-6zkqh\" 
(UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.936990 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert\") pod \"monitoring-plugin-6558455fc8-8qww9\" (UID: \"5c65aadf-c6fc-4959-9366-3e9d378bb507\") " pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.937017 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-modprobe-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.937036 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-conf\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.937319 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-config\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.937577 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: 
\"kubernetes.io/secret/03b97fde-467c-46f0-95f9-9c3820b4d790-srv-cert\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.937862 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8d2fdba3-9478-4165-9207-d01483625607-metrics-tls\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.937906 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/a3c91eef-ec46-419f-b418-ac3a8094b77d-ovnkube-identity-cm\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938266 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/25332da9-099c-4190-9e24-c19c86830a54-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938343 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/db9faadf-74e9-4a7f-b3a6-902dd14ac978-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:41:53.940706 
master-0 kubenswrapper[17876]: I0313 10:41:53.938410 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-env-overrides\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938659 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938694 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d9fd7b06-d61d-47c3-a08f-846245c79cc9-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938756 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938792 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: 
\"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938824 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97328e01-1227-417e-9af7-6426495d96db-tmpfs\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938841 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-serving-cert\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938852 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938870 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938884 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htb49\" 
(UniqueName: \"kubernetes.io/projected/2c3e94d4-5c6d-4092-975c-e5bca49eb397-kube-api-access-htb49\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938952 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938988 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.938993 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-utilities\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939018 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0529b217-a9ef-48fb-b40a-b6789c640c20-rootfs\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939049 17876 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939074 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/cf740515-d70d-44b6-ac00-21143b5494d1-metrics-tls\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939075 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939183 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939259 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/893dac15-d6d4-4a1f-988c-59aaf9e63334-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 
10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939380 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-operand-assets\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939420 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939444 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ecb5bdcc-647d-4292-a33d-dc3df331c206-serving-cert\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939553 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-utilities\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939576 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-ca\") pod 
\"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: E0313 10:41:53.939617 17876 configmap.go:193] Couldn't get configMap openshift-multus/cni-sysctl-allowlist: object "openshift-multus"/"cni-sysctl-allowlist" not registered Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: E0313 10:41:53.939700 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist podName:277614e8-838f-4773-bcfc-89f19c620dee nodeName:}" failed. No retries permitted until 2026-03-13 10:41:54.439674157 +0000 UTC m=+22.275480633 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cni-sysctl-allowlist" (UniqueName: "kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist") pod "cni-sysctl-allowlist-ds-7z94w" (UID: "277614e8-838f-4773-bcfc-89f19c620dee") : object "openshift-multus"/"cni-sysctl-allowlist" not registered Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939787 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/cf740515-d70d-44b6-ac00-21143b5494d1-trusted-ca\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939811 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/1ef32245-c238-43c6-a57a-a5ac95aff1f7-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:41:53.940706 master-0 
kubenswrapper[17876]: I0313 10:41:53.940052 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/25332da9-099c-4190-9e24-c19c86830a54-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940193 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-daemon-config\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940240 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-tuned\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.939678 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940404 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hfjxf\" (UniqueName: \"kubernetes.io/projected/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-kube-api-access-hfjxf\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " 
pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940449 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940458 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/97328e01-1227-417e-9af7-6426495d96db-tmpfs\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940472 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940483 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940521 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940550 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/db9faadf-74e9-4a7f-b3a6-902dd14ac978-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940579 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940609 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940637 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls\") pod 
\"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940671 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940701 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940706 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-config\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940723 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovn-node-metrics-cert\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940731 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940778 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:53.940706 master-0 kubenswrapper[17876]: I0313 10:41:53.940792 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ba3e43ba-2840-4612-a370-87ad3c5a382a-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940813 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940842 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod 
\"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940877 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940905 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940928 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-lib-modules\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940936 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53da2840-4a92-497a-a9d3-973583887147-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.940949 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-host\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941014 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941051 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-cabundle\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941053 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fb060653-0d4b-4759-a7a1-c5dce194cce7-ovnkube-script-lib\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941448 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1f358d81-87c6-40bf-89e8-5681429285f8-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941497 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6hq5\" (UniqueName: \"kubernetes.io/projected/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-kube-api-access-t6hq5\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941633 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941697 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941726 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941755 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941804 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941839 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-node-pullsecrets\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941903 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.941993 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942101 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/024d9bd3-ac77-4257-9808-7518f2a73e11-srv-cert\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942092 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942148 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942183 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942218 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942246 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942274 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-service-ca-bundle\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942298 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/893dac15-d6d4-4a1f-988c-59aaf9e63334-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942306 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/06ecac2e-bffa-474b-a824-9ba4a194159a-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942337 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942362 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942366 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e69683c-59c5-43da-b105-ef2efb2d0a4e-config\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942584 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-cabundle\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942664 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942687 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942708 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-catalog-content\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942732 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bn7vq\" (UniqueName: \"kubernetes.io/projected/f99b999c-4213-4d29-ab14-26c584e88445-kube-api-access-bn7vq\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942752 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7c5279e3-0165-4347-bfc7-87b80accaab3-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " 
pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942778 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942804 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d9fd7b06-d61d-47c3-a08f-846245c79cc9-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942808 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942845 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942870 17876 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942880 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1f358d81-87c6-40bf-89e8-5681429285f8-config\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942887 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942932 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942941 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtq5b\" (UniqueName: \"kubernetes.io/projected/c179be5b-2517-4ae5-9c30-2d4415899123-kube-api-access-jtq5b\") pod \"ingress-canary-p5ncj\" (UID: 
\"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.942998 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ba3e43ba-2840-4612-a370-87ad3c5a382a-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943005 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-encryption-config\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943076 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/7c5279e3-0165-4347-bfc7-87b80accaab3-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943089 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943119 17876 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f99b999c-4213-4d29-ab14-26c584e88445-catalog-content\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943128 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-catalog-content\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943150 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943171 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943191 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943209 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943229 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943249 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943269 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b7090328-1191-4c7c-afed-603d7333014f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943290 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txw7p\" (UniqueName: 
\"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943289 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-config\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943306 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-serving-cert\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943311 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6smf\" (UniqueName: \"kubernetes.io/projected/161beda5-f575-4e60-8baa-5262a4fe86c7-kube-api-access-q6smf\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943427 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/17b956d3-c046-4f26-8be2-718c165a3acc-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 
10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943456 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943481 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943524 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/53da2840-4a92-497a-a9d3-973583887147-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943549 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943574 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: 
\"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943593 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943597 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cni-binary-copy\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943612 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943643 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 
10:41:53.943663 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943684 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943687 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/a3c91eef-ec46-419f-b418-ac3a8094b77d-webhook-cert\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943705 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hwfd8\" (UniqueName: \"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943780 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/0932314b-ccf5-4be5-99f8-b99886392daa-etcd-client\") pod 
\"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943849 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943866 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943905 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2157cb66-d458-4353-bc9c-ef761e61e5c5-catalog-content\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943908 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.943987 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944001 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ecb5bdcc-647d-4292-a33d-dc3df331c206-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944011 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944041 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m5x2b\" (UniqueName: \"kubernetes.io/projected/0529b217-a9ef-48fb-b40a-b6789c640c20-kube-api-access-m5x2b\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944062 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " 
pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944150 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffmmr\" (UniqueName: \"kubernetes.io/projected/97328e01-1227-417e-9af7-6426495d96db-kube-api-access-ffmmr\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944234 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944267 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-image-import-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944329 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944361 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944390 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944417 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944365 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/193b3b95-f9a3-4272-853b-86366ce348a2-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944479 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") 
pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944508 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944599 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/193b3b95-f9a3-4272-853b-86366ce348a2-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944474 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e69683c-59c5-43da-b105-ef2efb2d0a4e-serving-cert\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944650 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944763 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944817 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.944857 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chxxr\" (UniqueName: \"kubernetes.io/projected/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-kube-api-access-chxxr\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.945089 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gcf6\" (UniqueName: \"kubernetes.io/projected/3e15f776-d153-4289-91c7-893584104185-kube-api-access-2gcf6\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:53.945207 master-0 kubenswrapper[17876]: I0313 10:41:53.945237 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppjzw\" (UniqueName: \"kubernetes.io/projected/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-kube-api-access-ppjzw\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " 
pod="openshift-dns/node-resolver-d542b" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945359 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfxbl\" (UniqueName: \"kubernetes.io/projected/d23e2957-3a22-44f6-937c-5ab6314681c0-kube-api-access-mfxbl\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945432 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945483 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945531 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-dir\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945588 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945663 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ed7aff-47c0-42f3-9a26-9385d2bde582-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945804 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945907 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-key\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945953 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.945991 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.946055 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0932314b-ccf5-4be5-99f8-b99886392daa-config\") pod \"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.946303 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2c3e94d4-5c6d-4092-975c-e5bca49eb397-signing-key\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" Mar 13 10:41:53.948549 master-0 kubenswrapper[17876]: I0313 10:41:53.946463 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 13 10:41:53.962115 master-0 kubenswrapper[17876]: I0313 10:41:53.962044 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 10:41:53.983422 master-0 kubenswrapper[17876]: I0313 10:41:53.983365 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 13 10:41:53.988286 master-0 kubenswrapper[17876]: I0313 10:41:53.988240 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: 
\"kubernetes.io/configmap/cc66541c-6410-4824-b173-53747069429e-whereabouts-configmap\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:54.010606 master-0 kubenswrapper[17876]: E0313 10:41:54.010372 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.023633 master-0 kubenswrapper[17876]: I0313 10:41:54.023575 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 10:41:54.028634 master-0 kubenswrapper[17876]: I0313 10:41:54.028584 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/58685de6-b4ae-4229-870b-5143a6010450-iptables-alerter-script\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:41:54.043206 master-0 kubenswrapper[17876]: I0313 10:41:54.043150 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 10:41:54.047360 master-0 kubenswrapper[17876]: I0313 10:41:54.047321 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist\") pod \"277614e8-838f-4773-bcfc-89f19c620dee\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " Mar 13 10:41:54.047463 master-0 kubenswrapper[17876]: I0313 10:41:54.047447 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") pod 
\"277614e8-838f-4773-bcfc-89f19c620dee\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " Mar 13 10:41:54.047783 master-0 kubenswrapper[17876]: I0313 10:41:54.047704 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:54.047783 master-0 kubenswrapper[17876]: I0313 10:41:54.047736 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-sys\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.047875 master-0 kubenswrapper[17876]: I0313 10:41:54.047825 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.047875 master-0 kubenswrapper[17876]: I0313 10:41:54.047869 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.047950 master-0 kubenswrapper[17876]: I0313 10:41:54.047864 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "277614e8-838f-4773-bcfc-89f19c620dee" (UID: 
"277614e8-838f-4773-bcfc-89f19c620dee"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:41:54.047988 master-0 kubenswrapper[17876]: I0313 10:41:54.047964 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-netns\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.048025 master-0 kubenswrapper[17876]: I0313 10:41:54.047986 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-sys\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.048062 master-0 kubenswrapper[17876]: I0313 10:41:54.048031 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.048062 master-0 kubenswrapper[17876]: I0313 10:41:54.048053 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.048181 master-0 kubenswrapper[17876]: I0313 10:41:54.048072 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: 
\"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.048181 master-0 kubenswrapper[17876]: I0313 10:41:54.048116 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:54.048265 master-0 kubenswrapper[17876]: I0313 10:41:54.048157 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready" (OuterVolumeSpecName: "ready") pod "277614e8-838f-4773-bcfc-89f19c620dee" (UID: "277614e8-838f-4773-bcfc-89f19c620dee"). InnerVolumeSpecName "ready". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:41:54.048265 master-0 kubenswrapper[17876]: I0313 10:41:54.048193 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-k8s-cni-cncf-io\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.048361 master-0 kubenswrapper[17876]: I0313 10:41:54.048320 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.048403 master-0 kubenswrapper[17876]: I0313 10:41:54.048356 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-cnibin\") pod \"multus-additional-cni-plugins-72t2n\" (UID: 
\"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:54.048403 master-0 kubenswrapper[17876]: I0313 10:41:54.048361 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:54.048473 master-0 kubenswrapper[17876]: I0313 10:41:54.048407 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:54.048473 master-0 kubenswrapper[17876]: I0313 10:41:54.048401 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:54.048473 master-0 kubenswrapper[17876]: I0313 10:41:54.048433 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:41:54.048565 master-0 kubenswrapper[17876]: I0313 10:41:54.048477 17876 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-slash\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.048565 master-0 kubenswrapper[17876]: I0313 10:41:54.048541 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.048619 master-0 kubenswrapper[17876]: I0313 10:41:54.048575 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.048619 master-0 kubenswrapper[17876]: I0313 10:41:54.048605 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-netd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.048672 master-0 kubenswrapper[17876]: I0313 10:41:54.048636 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 
13 10:41:54.048743 master-0 kubenswrapper[17876]: I0313 10:41:54.048719 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.048780 master-0 kubenswrapper[17876]: I0313 10:41:54.048609 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:54.048844 master-0 kubenswrapper[17876]: I0313 10:41:54.048826 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.048934 master-0 kubenswrapper[17876]: I0313 10:41:54.048913 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.048975 master-0 kubenswrapper[17876]: I0313 10:41:54.048960 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:54.048975 master-0 
kubenswrapper[17876]: I0313 10:41:54.048965 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.049037 master-0 kubenswrapper[17876]: I0313 10:41:54.048994 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.049037 master-0 kubenswrapper[17876]: I0313 10:41:54.049019 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:54.049089 master-0 kubenswrapper[17876]: I0313 10:41:54.049045 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.049089 master-0 kubenswrapper[17876]: I0313 10:41:54.049050 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: 
\"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:54.049089 master-0 kubenswrapper[17876]: I0313 10:41:54.049080 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:54.049192 master-0 kubenswrapper[17876]: I0313 10:41:54.049114 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-system-cni-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:54.049192 master-0 kubenswrapper[17876]: I0313 10:41:54.049143 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:41:54.049192 master-0 kubenswrapper[17876]: I0313 10:41:54.049165 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:54.049192 master-0 kubenswrapper[17876]: I0313 10:41:54.049189 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod 
\"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.049299 master-0 kubenswrapper[17876]: I0313 10:41:54.049278 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:54.049330 master-0 kubenswrapper[17876]: I0313 10:41:54.049304 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-multus\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.049359 master-0 kubenswrapper[17876]: I0313 10:41:54.049344 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:54.049397 master-0 kubenswrapper[17876]: I0313 10:41:54.049371 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.049428 master-0 kubenswrapper[17876]: I0313 10:41:54.049408 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:54.049458 master-0 kubenswrapper[17876]: I0313 10:41:54.049445 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.049495 master-0 kubenswrapper[17876]: I0313 10:41:54.049480 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-systemd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.049535 master-0 kubenswrapper[17876]: I0313 10:41:54.049523 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-hosts-file\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b" Mar 13 10:41:54.049600 master-0 kubenswrapper[17876]: I0313 10:41:54.049586 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:41:54.049638 master-0 kubenswrapper[17876]: I0313 10:41:54.049601 17876 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.049669 master-0 kubenswrapper[17876]: I0313 10:41:54.049655 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysconfig\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.049703 master-0 kubenswrapper[17876]: I0313 10:41:54.049676 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.049703 master-0 kubenswrapper[17876]: I0313 10:41:54.049676 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-hosts-file\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b" Mar 13 10:41:54.049703 master-0 kubenswrapper[17876]: I0313 10:41:54.049700 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:41:54.049847 master-0 kubenswrapper[17876]: I0313 
10:41:54.049742 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-systemd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.049847 master-0 kubenswrapper[17876]: I0313 10:41:54.049752 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:54.049847 master-0 kubenswrapper[17876]: I0313 10:41:54.049776 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:41:54.049847 master-0 kubenswrapper[17876]: I0313 10:41:54.049785 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysconfig\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.049847 master-0 kubenswrapper[17876]: I0313 10:41:54.049801 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:54.049847 
master-0 kubenswrapper[17876]: I0313 10:41:54.049809 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.049847 master-0 kubenswrapper[17876]: I0313 10:41:54.049843 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 13 10:41:54.050044 master-0 kubenswrapper[17876]: I0313 10:41:54.049868 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:54.050044 master-0 kubenswrapper[17876]: I0313 10:41:54.049918 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-run\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.050044 master-0 kubenswrapper[17876]: I0313 10:41:54.049985 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " 
pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:41:54.050044 master-0 kubenswrapper[17876]: I0313 10:41:54.050009 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:54.050044 master-0 kubenswrapper[17876]: I0313 10:41:54.050031 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.050262 master-0 kubenswrapper[17876]: I0313 10:41:54.050057 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj" Mar 13 10:41:54.050262 master-0 kubenswrapper[17876]: I0313 10:41:54.050070 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/8d2fdba3-9478-4165-9207-d01483625607-host-etc-kube\") pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:41:54.050262 master-0 kubenswrapper[17876]: I0313 10:41:54.050083 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050262 master-0 kubenswrapper[17876]: I0313 10:41:54.050248 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-conf-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050428 master-0 kubenswrapper[17876]: I0313 10:41:54.050133 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-run\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.050428 master-0 kubenswrapper[17876]: I0313 10:41:54.050273 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.050428 master-0 kubenswrapper[17876]: I0313 10:41:54.050170 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/b5ed7aff-47c0-42f3-9a26-9385d2bde582-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:54.050428 master-0 kubenswrapper[17876]: I0313 10:41:54.050153 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.050428 master-0 kubenswrapper[17876]: I0313 10:41:54.050252 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-log-socket\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.050428 master-0 kubenswrapper[17876]: I0313 10:41:54.050384 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050408 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050467 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050494 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050508 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050515 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050527 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-socket-dir-parent\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050560 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-kubernetes\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050593 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-os-release\") pod 
\"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050622 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.050660 master-0 kubenswrapper[17876]: I0313 10:41:54.050645 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050625 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-kubernetes\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050696 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-hostroot\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050737 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051039 
master-0 kubenswrapper[17876]: I0313 10:41:54.050759 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050794 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-systemd\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050793 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-cnibin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050799 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-kubelet\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050831 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050860 17876 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050899 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-systemd-units\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050915 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050978 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-audit-dir\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.051039 master-0 kubenswrapper[17876]: I0313 10:41:54.050971 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-audit-dir\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051077 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051198 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-dir\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051224 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051241 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/ec33c506-8abe-4659-84d3-a294c31b446c-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051251 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.051487 master-0 
kubenswrapper[17876]: I0313 10:41:54.051305 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-dir\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051307 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051367 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-var-lib-kubelet\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051394 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d23e2957-3a22-44f6-937c-5ab6314681c0-host\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051407 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-var-lib-kubelet\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.051487 master-0 kubenswrapper[17876]: I0313 10:41:54.051418 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.051766 master-0 kubenswrapper[17876]: I0313 10:41:54.051494 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d23e2957-3a22-44f6-937c-5ab6314681c0-host\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:54.051766 master-0 kubenswrapper[17876]: I0313 10:41:54.051613 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert\") pod \"monitoring-plugin-6558455fc8-8qww9\" (UID: \"5c65aadf-c6fc-4959-9366-3e9d378bb507\") " pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" Mar 13 10:41:54.051766 master-0 kubenswrapper[17876]: I0313 10:41:54.051667 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-modprobe-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.051766 master-0 kubenswrapper[17876]: I0313 10:41:54.051703 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-conf\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.051766 master-0 kubenswrapper[17876]: 
I0313 10:41:54.051736 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051781 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-modprobe-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051788 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051857 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051893 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0529b217-a9ef-48fb-b40a-b6789c640c20-rootfs\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051919 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051943 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051962 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" Mar 13 10:41:54.051980 master-0 kubenswrapper[17876]: I0313 10:41:54.051969 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052007 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-conf\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052017 17876 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-var-lib-cni-bin\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052006 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/0529b217-a9ef-48fb-b40a-b6789c640c20-rootfs\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052014 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-system-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052033 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hfjxf\" (UniqueName: \"kubernetes.io/projected/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-kube-api-access-hfjxf\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052076 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 
10:41:54.052075 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-etc-kubernetes\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:54.052251 master-0 kubenswrapper[17876]: I0313 10:41:54.052230 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052268 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052307 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052332 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 
10:41:54.052336 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052351 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052377 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-lib-modules\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052395 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-host\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052403 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-run-ovn\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052405 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052416 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t6hq5\" (UniqueName: \"kubernetes.io/projected/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-kube-api-access-t6hq5\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052463 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.052469 master-0 kubenswrapper[17876]: I0313 10:41:54.052478 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-host\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052499 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052509 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052514 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-lib-modules\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052512 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-tuning-conf-dir\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052537 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-node-pullsecrets\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052572 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052608 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/018c9219-d314-4408-ac39-93475d87eefb-node-pullsecrets\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052612 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052634 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-netns\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052690 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052712 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052746 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-cni-bin\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052771 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-multus-cni-dir\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052837 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052864 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052892 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtq5b\" (UniqueName: \"kubernetes.io/projected/c179be5b-2517-4ae5-9c30-2d4415899123-kube-api-access-jtq5b\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052921 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052930 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052928 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052958 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052990 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/a3a72b45-a705-4335-9c04-c952ec5d9975-etc-sysctl-d\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f"
Mar 13 10:41:54.053020 master-0 kubenswrapper[17876]: I0313 10:41:54.052993 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-run-ovn-kubernetes\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053113 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053154 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b7090328-1191-4c7c-afed-603d7333014f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053183 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-txw7p\" (UniqueName: \"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053187 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/db9faadf-74e9-4a7f-b3a6-902dd14ac978-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053239 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/b7090328-1191-4c7c-afed-603d7333014f-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053272 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053300 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053329 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053409 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/cc66541c-6410-4824-b173-53747069429e-os-release\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053429 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053445 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053509 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053545 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-host-kubelet\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053588 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053621 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053633 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-etc-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053648 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053672 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053718 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-node-log\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053720 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/58685de6-b4ae-4229-870b-5143a6010450-host-slash\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053762 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053788 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053837 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053845 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:41:54.053939 master-0 kubenswrapper[17876]: I0313 10:41:54.053878 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfxbl\" (UniqueName: \"kubernetes.io/projected/d23e2957-3a22-44f6-937c-5ab6314681c0-kube-api-access-mfxbl\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054070 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-dir\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054117 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-dir\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054148 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054203 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054233 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fb060653-0d4b-4759-a7a1-c5dce194cce7-var-lib-openvswitch\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054257 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054265 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054320 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054409 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-host-run-multus-certs\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r"
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054431 17876 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/277614e8-838f-4773-bcfc-89f19c620dee-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:54.054725 master-0 kubenswrapper[17876]: I0313 10:41:54.054465 17876 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/277614e8-838f-4773-bcfc-89f19c620dee-ready\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:54.062771 master-0 kubenswrapper[17876]: I0313 10:41:54.062728 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 10:41:54.093279 master-0 kubenswrapper[17876]: I0313 10:41:54.093235 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 13 10:41:54.102674 master-0 kubenswrapper[17876]: I0313 10:41:54.102624 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 13 10:41:54.104064 master-0 kubenswrapper[17876]: I0313 10:41:54.104007 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/db9faadf-74e9-4a7f-b3a6-902dd14ac978-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:54.124465 master-0 kubenswrapper[17876]: I0313 10:41:54.124399 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 13 10:41:54.125282 master-0 kubenswrapper[17876]: I0313 10:41:54.125259 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-6gzxr_5da919b6-8545-4001-89f3-74cb289327f0/multus-admission-controller/0.log"
Mar 13 10:41:54.125362 master-0 kubenswrapper[17876]: I0313 10:41:54.125320 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:41:54.131942 master-0 kubenswrapper[17876]: I0313 10:41:54.131891 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.142279 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.142428 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.142483 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.142513 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.142542 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.142573 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.144461 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 13 10:41:54.148254 master-0 kubenswrapper[17876]: I0313 10:41:54.148195 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:54.149708 master-0 kubenswrapper[17876]: I0313 10:41:54.148995 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:54.155233 master-0 kubenswrapper[17876]: I0313 10:41:54.155199 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") pod \"277614e8-838f-4773-bcfc-89f19c620dee\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") "
Mar 13 10:41:54.156127 master-0 kubenswrapper[17876]: I0313 10:41:54.156033 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "277614e8-838f-4773-bcfc-89f19c620dee" (UID: "277614e8-838f-4773-bcfc-89f19c620dee"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:41:54.156222 master-0 kubenswrapper[17876]: I0313 10:41:54.156175 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 13 10:41:54.156493 master-0 kubenswrapper[17876]: I0313 10:41:54.156430 17876 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/277614e8-838f-4773-bcfc-89f19c620dee-tuning-conf-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:54.162686 master-0 kubenswrapper[17876]: I0313 10:41:54.162628 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 13 10:41:54.195029 master-0 kubenswrapper[17876]: I0313 10:41:54.194944 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 13 10:41:54.202667 master-0 kubenswrapper[17876]: I0313 10:41:54.202587 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 13 10:41:54.207399 master-0 kubenswrapper[17876]: I0313 10:41:54.207344 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"
Mar 13 10:41:54.222220 master-0 kubenswrapper[17876]: I0313 10:41:54.222137 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 10:41:54.226923 master-0 kubenswrapper[17876]: I0313 10:41:54.226842 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/258f571e-5ec8-42df-b4ba-17457d87d10d-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-85vcp\" (UID: \"258f571e-5ec8-42df-b4ba-17457d87d10d\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp"
Mar 13 10:41:54.242765 master-0 kubenswrapper[17876]: I0313 10:41:54.242711 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 10:41:54.251387 master-0 kubenswrapper[17876]: I0313 10:41:54.251330 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-etcd-serving-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:54.259296 master-0 kubenswrapper[17876]: I0313 10:41:54.259215 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-7z94w_277614e8-838f-4773-bcfc-89f19c620dee/kube-multus-additional-cni-plugins/0.log"
Mar 13 10:41:54.259509 master-0 kubenswrapper[17876]: I0313 10:41:54.259332 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w" event={"ID":"277614e8-838f-4773-bcfc-89f19c620dee","Type":"ContainerDied","Data":"2a3ae0ef1861ea401e0b8a9b1d8fd796b2315f2b16e1b237d258aa72508e4e53"}
Mar 13 10:41:54.259509 master-0 kubenswrapper[17876]: I0313 10:41:54.259366 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w"
Mar 13 10:41:54.259509 master-0 kubenswrapper[17876]: I0313 10:41:54.259411 17876 scope.go:117] "RemoveContainer" containerID="e7184f84cd4474e6a6cc53b836b41501e5b07f8ddbe2de4a87f3c6adbc3bb1eb"
Mar 13 10:41:54.265290 master-0 kubenswrapper[17876]: I0313 10:41:54.261864 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-6gzxr_5da919b6-8545-4001-89f3-74cb289327f0/multus-admission-controller/0.log"
Mar 13 10:41:54.265290 master-0 kubenswrapper[17876]: I0313 10:41:54.262729 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 10:41:54.265290 master-0 kubenswrapper[17876]: I0313 10:41:54.262775 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:41:54.265290 master-0 kubenswrapper[17876]: I0313 10:41:54.262993 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr" event={"ID":"5da919b6-8545-4001-89f3-74cb289327f0","Type":"ContainerDied","Data":"9eb81fef2a10fdac9c228bb26aef29e151ecfe34e45ad78b6841550ead2dd190"}
Mar 13 10:41:54.265290 master-0 kubenswrapper[17876]: I0313 10:41:54.263585 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:41:54.265290 master-0 kubenswrapper[17876]: I0313 10:41:54.264334 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-encryption-config\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:54.275243 master-0 kubenswrapper[17876]: I0313 10:41:54.275189 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:41:54.283034 master-0 kubenswrapper[17876]: I0313 10:41:54.282989 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 10:41:54.283298 master-0 kubenswrapper[17876]: I0313 10:41:54.283258 17876 scope.go:117] "RemoveContainer" containerID="a00b9478d33bfd54d41596723606f59efdd5a5faf516c48ad42c690af80911c4"
Mar 13 10:41:54.305512 master-0 kubenswrapper[17876]: I0313 10:41:54.300960 17876 scope.go:117] "RemoveContainer" containerID="2276fd8efc0fde40f37ca319cd91132fc15d5529319ce35ac0901720d64c7ce3"
Mar 13 10:41:54.305512 master-0 kubenswrapper[17876]: I0313 10:41:54.302974 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 10:41:54.308989 master-0 kubenswrapper[17876]: I0313 10:41:54.308947 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-audit-policies\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:54.311492 master-0 kubenswrapper[17876]: I0313 10:41:54.311443 17876 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.311705 master-0 kubenswrapper[17876]: I0313 10:41:54.311668 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.311920 master-0 kubenswrapper[17876]: I0313 10:41:54.311824 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.312001 master-0 kubenswrapper[17876]: I0313 10:41:54.311986 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.319498 master-0 kubenswrapper[17876]: I0313 10:41:54.319442 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:41:54.323757 master-0 kubenswrapper[17876]: I0313 10:41:54.323720 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 10:41:54.330060 master-0 kubenswrapper[17876]: I0313 10:41:54.330001 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-serving-ca\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:54.343291 master-0 kubenswrapper[17876]: I0313 10:41:54.343076 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 10:41:54.358924 master-0 kubenswrapper[17876]: I0313 10:41:54.358860 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") 
pod \"3b44838d-cfe0-42fe-9927-d0b5391eee81\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " Mar 13 10:41:54.359135 master-0 kubenswrapper[17876]: I0313 10:41:54.358972 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock" (OuterVolumeSpecName: "var-lock") pod "3b44838d-cfe0-42fe-9927-d0b5391eee81" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:54.359135 master-0 kubenswrapper[17876]: I0313 10:41:54.359115 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") pod \"3b44838d-cfe0-42fe-9927-d0b5391eee81\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " Mar 13 10:41:54.359579 master-0 kubenswrapper[17876]: I0313 10:41:54.359513 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3b44838d-cfe0-42fe-9927-d0b5391eee81" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:41:54.361868 master-0 kubenswrapper[17876]: I0313 10:41:54.361622 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:54.361868 master-0 kubenswrapper[17876]: I0313 10:41:54.361656 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3b44838d-cfe0-42fe-9927-d0b5391eee81-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:54.362485 master-0 kubenswrapper[17876]: I0313 10:41:54.362438 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 13 10:41:54.369837 master-0 kubenswrapper[17876]: I0313 10:41:54.369797 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b57f1c19-f44a-4405-8135-79aef1d1ce07-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:41:54.383323 master-0 kubenswrapper[17876]: I0313 10:41:54.383253 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 10:41:54.389125 master-0 kubenswrapper[17876]: I0313 10:41:54.389060 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-etcd-client\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.402450 master-0 kubenswrapper[17876]: I0313 10:41:54.402360 17876 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 13 10:41:54.404450 master-0 kubenswrapper[17876]: I0313 10:41:54.404385 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-serving-cert\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.409172 master-0 kubenswrapper[17876]: I0313 10:41:54.408925 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:54.413573 master-0 kubenswrapper[17876]: I0313 10:41:54.413527 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-85vcp" Mar 13 10:41:54.423545 master-0 kubenswrapper[17876]: I0313 10:41:54.423496 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 10:41:54.430722 master-0 kubenswrapper[17876]: I0313 10:41:54.430681 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-metrics-certs\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:54.443214 master-0 kubenswrapper[17876]: I0313 10:41:54.443147 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 13 10:41:54.451644 master-0 kubenswrapper[17876]: I0313 10:41:54.451599 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-default-certificate\") pod 
\"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:54.463072 master-0 kubenswrapper[17876]: I0313 10:41:54.462860 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 13 10:41:54.470761 master-0 kubenswrapper[17876]: I0313 10:41:54.470680 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-stats-auth\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:54.483479 master-0 kubenswrapper[17876]: I0313 10:41:54.482721 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 10:41:54.488651 master-0 kubenswrapper[17876]: I0313 10:41:54.488591 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-etcd-client\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:54.501218 master-0 kubenswrapper[17876]: I0313 10:41:54.501150 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 13 10:41:54.502459 master-0 kubenswrapper[17876]: I0313 10:41:54.502427 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 10:41:54.503629 master-0 kubenswrapper[17876]: I0313 10:41:54.503595 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-service-ca-bundle\") pod 
\"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:54.523184 master-0 kubenswrapper[17876]: I0313 10:41:54.523107 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 10:41:54.528580 master-0 kubenswrapper[17876]: I0313 10:41:54.528523 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-serving-cert\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:54.543070 master-0 kubenswrapper[17876]: I0313 10:41:54.543009 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 10:41:54.553256 master-0 kubenswrapper[17876]: I0313 10:41:54.553195 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.562895 master-0 kubenswrapper[17876]: I0313 10:41:54.562697 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 10:41:54.565765 master-0 kubenswrapper[17876]: I0313 10:41:54.565713 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-image-import-ca\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.583063 master-0 kubenswrapper[17876]: I0313 10:41:54.582974 17876 reflector.go:368] Caches populated 
for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 10:41:54.603295 master-0 kubenswrapper[17876]: I0313 10:41:54.602860 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 10:41:54.622515 master-0 kubenswrapper[17876]: I0313 10:41:54.622428 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 10:41:54.629693 master-0 kubenswrapper[17876]: I0313 10:41:54.629646 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-trusted-ca-bundle\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:54.643394 master-0 kubenswrapper[17876]: I0313 10:41:54.643322 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 10:41:54.648616 master-0 kubenswrapper[17876]: I0313 10:41:54.648566 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-audit\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:54.663527 master-0 kubenswrapper[17876]: I0313 10:41:54.663465 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 10:41:54.682892 master-0 kubenswrapper[17876]: I0313 10:41:54.682832 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 10:41:54.685002 master-0 kubenswrapper[17876]: I0313 10:41:54.684531 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:54.702848 master-0 kubenswrapper[17876]: I0313 10:41:54.702781 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 13 10:41:54.703693 master-0 kubenswrapper[17876]: I0313 10:41:54.703655 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/06ecac2e-bffa-474b-a824-9ba4a194159a-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:41:54.722994 master-0 kubenswrapper[17876]: I0313 10:41:54.722811 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 10:41:54.742670 master-0 kubenswrapper[17876]: I0313 10:41:54.742611 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 10:41:54.745505 master-0 kubenswrapper[17876]: I0313 10:41:54.745466 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:54.762256 master-0 kubenswrapper[17876]: I0313 10:41:54.762206 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 13 
10:41:54.784262 master-0 kubenswrapper[17876]: I0313 10:41:54.783565 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 13 10:41:54.802583 master-0 kubenswrapper[17876]: I0313 10:41:54.802495 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 10:41:54.821881 master-0 kubenswrapper[17876]: I0313 10:41:54.821686 17876 request.go:700] Waited for 1.010886183s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-route-controller-manager/secrets?fieldSelector=metadata.name%3Dserving-cert&limit=500&resourceVersion=0 Mar 13 10:41:54.823166 master-0 kubenswrapper[17876]: I0313 10:41:54.823061 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 13 10:41:54.827438 master-0 kubenswrapper[17876]: I0313 10:41:54.827345 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:54.842862 master-0 kubenswrapper[17876]: I0313 10:41:54.842801 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 13 10:41:54.866554 master-0 kubenswrapper[17876]: I0313 10:41:54.866472 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 10:41:54.882768 master-0 kubenswrapper[17876]: I0313 10:41:54.882687 17876 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 10:41:54.886482 master-0 kubenswrapper[17876]: I0313 10:41:54.886433 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b5ed7aff-47c0-42f3-9a26-9385d2bde582-serving-cert\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:54.902523 master-0 kubenswrapper[17876]: I0313 10:41:54.902440 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 13 10:41:54.907083 master-0 kubenswrapper[17876]: I0313 10:41:54.907016 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/b5ed7aff-47c0-42f3-9a26-9385d2bde582-service-ca\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2" Mar 13 10:41:54.922385 master-0 kubenswrapper[17876]: I0313 10:41:54.922325 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 10:41:54.935454 master-0 kubenswrapper[17876]: E0313 10:41:54.935392 17876 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935454 master-0 kubenswrapper[17876]: E0313 10:41:54.935427 17876 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935481 17876 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca podName:3bf5e05a-443b-41dc-b464-3d2f1ace50a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435462158 +0000 UTC m=+23.271268724 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca") pod "controller-manager-79847c4f97-tf57f" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935503 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config podName:61427254-6722-4d1a-a96a-dadd24abbe94 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435495429 +0000 UTC m=+23.271301905 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-c65k4" (UID: "61427254-6722-4d1a-a96a-dadd24abbe94") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935543 17876 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935576 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config podName:a0917212-59d8-4799-a9bc-52e358c5e8a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435568801 +0000 UTC m=+23.271375277 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config") pod "machine-api-operator-84bf6db4f9-svqcp" (UID: "a0917212-59d8-4799-a9bc-52e358c5e8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935609 17876 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935637 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle podName:9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435629213 +0000 UTC m=+23.271435689 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle") pod "insights-operator-8f89dfddd-v9x5b" (UID: "9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935852 17876 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.935879 master-0 kubenswrapper[17876]: E0313 10:41:54.935873 17876 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935903 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls podName:2563ecb2-5783-4c45-a7f6-180e14e1c8c4 nodeName:}" 
failed. No retries permitted until 2026-03-13 10:41:55.43589328 +0000 UTC m=+23.271699756 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-82x6j" (UID: "2563ecb2-5783-4c45-a7f6-180e14e1c8c4") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935907 17876 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935932 17876 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935948 17876 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935937 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert podName:97328e01-1227-417e-9af7-6426495d96db nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435913171 +0000 UTC m=+23.271719647 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert") pod "packageserver-85b658d7fb-45fq6" (UID: "97328e01-1227-417e-9af7-6426495d96db") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935981 17876 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.935989 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs podName:9ca1b7c7-41af-46e9-8f5d-a476ee2b7587 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435978323 +0000 UTC m=+23.271784799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs") pod "multus-admission-controller-7769569c45-6lqz5" (UID: "9ca1b7c7-41af-46e9-8f5d-a476ee2b7587") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936004 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config podName:8dc7af5f-ff72-4f06-88df-a26ff4c0bded nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.435996793 +0000 UTC m=+23.271803269 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config") pod "machine-approver-754bdc9f9d-942bv" (UID: "8dc7af5f-ff72-4f06-88df-a26ff4c0bded") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936020 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert podName:0881de70-2db3-4fc2-b976-b55c11dc239d nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.436012144 +0000 UTC m=+23.271818620 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert") pod "cluster-baremetal-operator-5cdb4c5598-2c4sl" (UID: "0881de70-2db3-4fc2-b976-b55c11dc239d") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936040 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls podName:b7090328-1191-4c7c-afed-603d7333014f nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.436033314 +0000 UTC m=+23.271839790 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" (UID: "b7090328-1191-4c7c-afed-603d7333014f") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936063 17876 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936117 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls podName:7748068f-7409-4972-81d2-84cfb52b7af0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.436090656 +0000 UTC m=+23.271897132 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-zpqlc" (UID: "7748068f-7409-4972-81d2-84cfb52b7af0") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936158 17876 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936191 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config podName:9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.436183229 +0000 UTC m=+23.271989835 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config") pod "machine-config-controller-ff46b7bdf-zx8pp" (UID: "9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936676 17876 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936719 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config podName:3bf5e05a-443b-41dc-b464-3d2f1ace50a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.436708204 +0000 UTC m=+23.272514680 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config") pod "controller-manager-79847c4f97-tf57f" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.936980 17876 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.937017 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config podName:e4b55ebf-cab8-4985-95cc-b28bc5ae0578 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.437009333 +0000 UTC m=+23.272815809 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config") pod "cluster-autoscaler-operator-69576476f7-p7qlt" (UID: "e4b55ebf-cab8-4985-95cc-b28bc5ae0578") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.937043 17876 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.937740 master-0 kubenswrapper[17876]: E0313 10:41:54.937064 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert podName:e4b55ebf-cab8-4985-95cc-b28bc5ae0578 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.437058734 +0000 UTC m=+23.272865200 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert") pod "cluster-autoscaler-operator-69576476f7-p7qlt" (UID: "e4b55ebf-cab8-4985-95cc-b28bc5ae0578") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.938219 17876 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.938284 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token podName:161beda5-f575-4e60-8baa-5262a4fe86c7 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.43827093 +0000 UTC m=+23.274077416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token") pod "machine-config-server-zkmjs" (UID: "161beda5-f575-4e60-8baa-5262a4fe86c7") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.939316 17876 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.939353 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert podName:97328e01-1227-417e-9af7-6426495d96db nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439344322 +0000 UTC m=+23.275150798 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert") pod "packageserver-85b658d7fb-45fq6" (UID: "97328e01-1227-417e-9af7-6426495d96db") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.939383 17876 configmap.go:193] Couldn't get configMap openshift-apiserver/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.939401 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle podName:018c9219-d314-4408-ac39-93475d87eefb nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439396423 +0000 UTC m=+23.275202899 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle") pod "apiserver-576d4447f8-zqphk" (UID: "018c9219-d314-4408-ac39-93475d87eefb") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.939410 master-0 kubenswrapper[17876]: E0313 10:41:54.939415 17876 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939434 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config podName:7748068f-7409-4972-81d2-84cfb52b7af0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439429644 +0000 UTC m=+23.275236120 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config") pod "prometheus-operator-5ff8674d55-zpqlc" (UID: "7748068f-7409-4972-81d2-84cfb52b7af0") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939470 17876 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939500 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls podName:0529b217-a9ef-48fb-b40a-b6789c640c20 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439491706 +0000 UTC m=+23.275298182 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls") pod "machine-config-daemon-j9twr" (UID: "0529b217-a9ef-48fb-b40a-b6789c640c20") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939520 17876 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939526 17876 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939539 17876 configmap.go:193] Couldn't get configMap openshift-machine-api/cluster-baremetal-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939551 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert podName:b02805e2-f186-4e59-bdfa-f4793263b468 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439543357 +0000 UTC m=+23.275349833 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-55d85b7b47-qbgcg" (UID: "b02805e2-f186-4e59-bdfa-f4793263b468") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939571 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images podName:b7090328-1191-4c7c-afed-603d7333014f nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439559858 +0000 UTC m=+23.275366334 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images") pod "cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" (UID: "b7090328-1191-4c7c-afed-603d7333014f") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939580 17876 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-kube-rbac-proxy-config: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939582 17876 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939591 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images podName:0881de70-2db3-4fc2-b976-b55c11dc239d nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439582909 +0000 UTC m=+23.275389385 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images") pod "cluster-baremetal-operator-5cdb4c5598-2c4sl" (UID: "0881de70-2db3-4fc2-b976-b55c11dc239d") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939620 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls podName:0881de70-2db3-4fc2-b976-b55c11dc239d nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439607509 +0000 UTC m=+23.275413985 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-2c4sl" (UID: "0881de70-2db3-4fc2-b976-b55c11dc239d") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939644 17876 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939719 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439701172 +0000 UTC m=+23.275507678 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-kube-rbac-proxy-config" (UniqueName: "kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939736 17876 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939752 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls podName:a0917212-59d8-4799-a9bc-52e358c5e8a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439739773 +0000 UTC m=+23.275546249 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-svqcp" (UID: "a0917212-59d8-4799-a9bc-52e358c5e8a0") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939654 17876 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939893 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca podName:b02805e2-f186-4e59-bdfa-f4793263b468 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439863437 +0000 UTC m=+23.275669953 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca") pod "cloud-credential-operator-55d85b7b47-qbgcg" (UID: "b02805e2-f186-4e59-bdfa-f4793263b468") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.939929 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle podName:9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.439918838 +0000 UTC m=+23.275725314 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle") pod "insights-operator-8f89dfddd-v9x5b" (UID: "9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.940139 17876 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.940369 master-0 kubenswrapper[17876]: E0313 10:41:54.940179 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images podName:61427254-6722-4d1a-a96a-dadd24abbe94 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.440169796 +0000 UTC m=+23.275976382 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images") pod "machine-config-operator-fdb5c78b5-c65k4" (UID: "61427254-6722-4d1a-a96a-dadd24abbe94") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940468 17876 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940563 17876 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940596 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images podName:a0917212-59d8-4799-a9bc-52e358c5e8a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.440572368 +0000 UTC m=+23.276378934 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images") pod "machine-api-operator-84bf6db4f9-svqcp" (UID: "a0917212-59d8-4799-a9bc-52e358c5e8a0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940493 17876 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940624 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:41:55.440615469 +0000 UTC m=+23.276422065 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940651 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls podName:3e15f776-d153-4289-91c7-893584104185 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.440634519 +0000 UTC m=+23.276441135 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls") pod "dns-default-qt95m" (UID: "3e15f776-d153-4289-91c7-893584104185") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940653 17876 secret.go:189] Couldn't get secret openshift-apiserver/encryption-config-1: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940706 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config podName:018c9219-d314-4408-ac39-93475d87eefb nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.440696151 +0000 UTC m=+23.276502757 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "encryption-config" (UniqueName: "kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config") pod "apiserver-576d4447f8-zqphk" (UID: "018c9219-d314-4408-ac39-93475d87eefb") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.940997 17876 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.941034 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert podName:9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.441023701 +0000 UTC m=+23.276830177 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert") pod "insights-operator-8f89dfddd-v9x5b" (UID: "9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.941038 17876 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.941077 17876 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.941151 17876 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 
kubenswrapper[17876]: E0313 10:41:54.941159 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config podName:8dc7af5f-ff72-4f06-88df-a26ff4c0bded nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.441089063 +0000 UTC m=+23.276895619 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config") pod "machine-approver-754bdc9f9d-942bv" (UID: "8dc7af5f-ff72-4f06-88df-a26ff4c0bded") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.941229 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs podName:161beda5-f575-4e60-8baa-5262a4fe86c7 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.441215796 +0000 UTC m=+23.277022282 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs") pod "machine-config-server-zkmjs" (UID: "161beda5-f575-4e60-8baa-5262a4fe86c7") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.941255 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls podName:61427254-6722-4d1a-a96a-dadd24abbe94 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.441247727 +0000 UTC m=+23.277054213 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls") pod "machine-config-operator-fdb5c78b5-c65k4" (UID: "61427254-6722-4d1a-a96a-dadd24abbe94") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: I0313 10:41:54.942658 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943135 17876 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943163 17876 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943183 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls podName:8dc7af5f-ff72-4f06-88df-a26ff4c0bded nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443170704 +0000 UTC m=+23.278977270 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls") pod "machine-approver-754bdc9f9d-942bv" (UID: "8dc7af5f-ff72-4f06-88df-a26ff4c0bded") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943206 17876 configmap.go:193] Couldn't get configMap openshift-monitoring/kube-state-metrics-custom-resource-state-configmap: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943235 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config podName:b7090328-1191-4c7c-afed-603d7333014f nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443215795 +0000 UTC m=+23.279022361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" (UID: "b7090328-1191-4c7c-afed-603d7333014f") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943246 17876 configmap.go:193] Couldn't get configMap openshift-monitoring/metrics-client-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943300 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443284677 +0000 UTC m=+23.279091223 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-custom-resource-state-configmap" (UniqueName: "kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943210 17876 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943326 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca podName:7748068f-7409-4972-81d2-84cfb52b7af0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443315608 +0000 UTC m=+23.279122084 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-client-ca" (UniqueName: "kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca") pod "prometheus-operator-5ff8674d55-zpqlc" (UID: "7748068f-7409-4972-81d2-84cfb52b7af0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943265 17876 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943435 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config podName:0529b217-a9ef-48fb-b40a-b6789c640c20 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443428781 +0000 UTC m=+23.279235247 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config") pod "machine-config-daemon-j9twr" (UID: "0529b217-a9ef-48fb-b40a-b6789c640c20") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943467 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs podName:5da919b6-8545-4001-89f3-74cb289327f0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443450322 +0000 UTC m=+23.279256888 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs") pod "multus-admission-controller-8d675b596-6gzxr" (UID: "5da919b6-8545-4001-89f3-74cb289327f0") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943908 17876 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943941 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls podName:9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443932986 +0000 UTC m=+23.279739462 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls") pod "machine-config-controller-ff46b7bdf-zx8pp" (UID: "9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943966 17876 configmap.go:193] Couldn't get configMap openshift-dns/dns-default: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.944744 master-0 kubenswrapper[17876]: E0313 10:41:54.943987 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume podName:3e15f776-d153-4289-91c7-893584104185 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.443981307 +0000 UTC m=+23.279787773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config-volume" (UniqueName: "kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume") pod "dns-default-qt95m" (UID: "3e15f776-d153-4289-91c7-893584104185") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.949091 master-0 kubenswrapper[17876]: E0313 10:41:54.945144 17876 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.949091 master-0 kubenswrapper[17876]: E0313 10:41:54.945215 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert podName:3bf5e05a-443b-41dc-b464-3d2f1ace50a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.445195363 +0000 UTC m=+23.281001909 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert") pod "controller-manager-79847c4f97-tf57f" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:54.949091 master-0 kubenswrapper[17876]: E0313 10:41:54.945725 17876 configmap.go:193] Couldn't get configMap openshift-machine-api/baremetal-kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.949091 master-0 kubenswrapper[17876]: E0313 10:41:54.945751 17876 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.949091 master-0 kubenswrapper[17876]: E0313 10:41:54.945799 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles podName:3bf5e05a-443b-41dc-b464-3d2f1ace50a0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.44578712 +0000 UTC m=+23.281593596 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles") pod "controller-manager-79847c4f97-tf57f" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.949091 master-0 kubenswrapper[17876]: E0313 10:41:54.945819 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config podName:0881de70-2db3-4fc2-b976-b55c11dc239d nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.445808561 +0000 UTC m=+23.281615037 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config") pod "cluster-baremetal-operator-5cdb4c5598-2c4sl" (UID: "0881de70-2db3-4fc2-b976-b55c11dc239d") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:54.962955 master-0 kubenswrapper[17876]: I0313 10:41:54.962889 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-r9v82" Mar 13 10:41:54.975022 master-0 kubenswrapper[17876]: I0313 10:41:54.974621 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") pod \"5da919b6-8545-4001-89f3-74cb289327f0\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " Mar 13 10:41:54.978272 master-0 kubenswrapper[17876]: I0313 10:41:54.978127 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "5da919b6-8545-4001-89f3-74cb289327f0" (UID: "5da919b6-8545-4001-89f3-74cb289327f0"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:41:54.982865 master-0 kubenswrapper[17876]: I0313 10:41:54.982823 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 10:41:55.002371 master-0 kubenswrapper[17876]: I0313 10:41:55.002219 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 13 10:41:55.030918 master-0 kubenswrapper[17876]: I0313 10:41:55.030852 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 10:41:55.042088 master-0 kubenswrapper[17876]: I0313 10:41:55.042024 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 10:41:55.048763 master-0 kubenswrapper[17876]: E0313 10:41:55.048580 17876 configmap.go:193] Couldn't get configMap openshift-authentication/audit: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.048763 master-0 kubenswrapper[17876]: E0313 10:41:55.048723 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.548694057 +0000 UTC m=+23.384500543 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "audit-policies" (UniqueName: "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049646 17876 secret.go:189] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049713 17876 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049785 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert podName:1109b282-3ee4-4c4e-a64a-e6a22adeb6c9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.549760148 +0000 UTC m=+23.385566624 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert") pod "console-operator-6c7fb6b958-rb7nv" (UID: "1109b282-3ee4-4c4e-a64a-e6a22adeb6c9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049795 17876 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049829 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.54980139 +0000 UTC m=+23.385607886 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049732 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049862 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.549844611 +0000 UTC m=+23.385651107 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049869 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.049915 master-0 kubenswrapper[17876]: E0313 10:41:55.049877 17876 configmap.go:193] Couldn't get configMap openshift-image-registry/image-registry-certificates: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.050393 master-0 kubenswrapper[17876]: E0313 10:41:55.049893 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.549881752 +0000 UTC m=+23.385688238 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.050393 master-0 kubenswrapper[17876]: E0313 10:41:55.049976 17876 configmap.go:193] Couldn't get configMap openshift-console-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.050393 master-0 kubenswrapper[17876]: E0313 10:41:55.049995 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.549971835 +0000 UTC m=+23.385778351 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-serving-cert" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.050393 master-0 kubenswrapper[17876]: E0313 10:41:55.050038 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca podName:d23e2957-3a22-44f6-937c-5ab6314681c0 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.550018636 +0000 UTC m=+23.385825142 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serviceca" (UniqueName: "kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca") pod "node-ca-trztz" (UID: "d23e2957-3a22-44f6-937c-5ab6314681c0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.050393 master-0 kubenswrapper[17876]: E0313 10:41:55.050071 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca podName:1109b282-3ee4-4c4e-a64a-e6a22adeb6c9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.550055107 +0000 UTC m=+23.385861693 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca") pod "console-operator-6c7fb6b958-rb7nv" (UID: "1109b282-3ee4-4c4e-a64a-e6a22adeb6c9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.050393 master-0 kubenswrapper[17876]: E0313 10:41:55.050379 17876 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.050659 master-0 kubenswrapper[17876]: E0313 10:41:55.050421 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert podName:c179be5b-2517-4ae5-9c30-2d4415899123 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.550410117 +0000 UTC m=+23.386216663 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert") pod "ingress-canary-p5ncj" (UID: "c179be5b-2517-4ae5-9c30-2d4415899123") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.052349 master-0 kubenswrapper[17876]: E0313 10:41:55.052271 17876 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.052349 master-0 kubenswrapper[17876]: E0313 10:41:55.052345 17876 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052375 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.552352084 +0000 UTC m=+23.388158630 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052283 17876 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052400 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-ocp-branding-template: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052403 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config podName:1109b282-3ee4-4c4e-a64a-e6a22adeb6c9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.552391716 +0000 UTC m=+23.388198262 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config") pod "console-operator-6c7fb6b958-rb7nv" (UID: "1109b282-3ee4-4c4e-a64a-e6a22adeb6c9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052308 17876 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052450 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.552432157 +0000 UTC m=+23.388238723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-ocp-branding-template" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052466 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls podName:7c5279e3-0165-4347-bfc7-87b80accaab3 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.552458638 +0000 UTC m=+23.388265114 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls") pod "kube-state-metrics-68b88f8cb5-dw9w6" (UID: "7c5279e3-0165-4347-bfc7-87b80accaab3") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.052499 master-0 kubenswrapper[17876]: E0313 10:41:55.052480 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert podName:5c65aadf-c6fc-4959-9366-3e9d378bb507 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.552473348 +0000 UTC m=+23.388279954 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert") pod "monitoring-plugin-6558455fc8-8qww9" (UID: "5c65aadf-c6fc-4959-9366-3e9d378bb507") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.053722 master-0 kubenswrapper[17876]: E0313 10:41:55.053678 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-user-template-provider-selection: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.053722 master-0 kubenswrapper[17876]: E0313 10:41:55.053705 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-user-template-login: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.053835 master-0 kubenswrapper[17876]: E0313 10:41:55.053734 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.553719925 +0000 UTC m=+23.389526421 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-provider-selection" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.053835 master-0 kubenswrapper[17876]: E0313 10:41:55.053738 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.053835 master-0 kubenswrapper[17876]: E0313 10:41:55.053754 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.553746735 +0000 UTC m=+23.389553221 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-login" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.053835 master-0 kubenswrapper[17876]: E0313 10:41:55.053785 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.553773216 +0000 UTC m=+23.389579762 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.055195 master-0 kubenswrapper[17876]: E0313 10:41:55.055164 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-router-certs: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.055254 master-0 kubenswrapper[17876]: E0313 10:41:55.055219 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:55.555207498 +0000 UTC m=+23.391014054 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "v4-0-config-system-router-certs" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:55.063318 master-0 kubenswrapper[17876]: I0313 10:41:55.063213 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 13 10:41:55.079252 master-0 kubenswrapper[17876]: I0313 10:41:55.079054 17876 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/5da919b6-8545-4001-89f3-74cb289327f0-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:41:55.082672 master-0 kubenswrapper[17876]: I0313 10:41:55.082629 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 10:41:55.102865 master-0 
kubenswrapper[17876]: I0313 10:41:55.102818 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 10:41:55.127241 master-0 kubenswrapper[17876]: I0313 10:41:55.127193 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 10:41:55.142364 master-0 kubenswrapper[17876]: I0313 10:41:55.142287 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-d2pmx" Mar 13 10:41:55.162145 master-0 kubenswrapper[17876]: I0313 10:41:55.162061 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 13 10:41:55.182337 master-0 kubenswrapper[17876]: I0313 10:41:55.182275 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 13 10:41:55.202856 master-0 kubenswrapper[17876]: I0313 10:41:55.202808 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 10:41:55.222647 master-0 kubenswrapper[17876]: I0313 10:41:55.222580 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 10:41:55.242539 master-0 kubenswrapper[17876]: I0313 10:41:55.242460 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 13 10:41:55.263594 master-0 kubenswrapper[17876]: I0313 10:41:55.263512 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 13 10:41:55.271229 master-0 kubenswrapper[17876]: I0313 10:41:55.271032 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:41:55.283841 master-0 kubenswrapper[17876]: I0313 10:41:55.283785 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 13 10:41:55.303603 master-0 kubenswrapper[17876]: I0313 10:41:55.303520 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dvqsb" Mar 13 10:41:55.323204 master-0 kubenswrapper[17876]: I0313 10:41:55.323115 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 10:41:55.342489 master-0 kubenswrapper[17876]: I0313 10:41:55.342337 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 10:41:55.362559 master-0 kubenswrapper[17876]: I0313 10:41:55.362473 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 10:41:55.382346 master-0 kubenswrapper[17876]: I0313 10:41:55.382283 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 10:41:55.403333 master-0 kubenswrapper[17876]: I0313 10:41:55.403279 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:41:55.423280 master-0 kubenswrapper[17876]: I0313 10:41:55.423210 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 10:41:55.442498 master-0 kubenswrapper[17876]: I0313 10:41:55.442411 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-5pbvv" Mar 13 
10:41:55.463509 master-0 kubenswrapper[17876]: I0313 10:41:55.463435 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:41:55.483384 master-0 kubenswrapper[17876]: I0313 10:41:55.483297 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 13 10:41:55.488083 master-0 kubenswrapper[17876]: I0313 10:41:55.488032 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:55.488421 master-0 kubenswrapper[17876]: I0313 10:41:55.488378 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:55.488478 master-0 kubenswrapper[17876]: I0313 10:41:55.488455 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3e15f776-d153-4289-91c7-893584104185-config-volume\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:55.488622 master-0 kubenswrapper[17876]: I0313 10:41:55.488585 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " 
pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:55.488668 master-0 kubenswrapper[17876]: I0313 10:41:55.488627 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.488668 master-0 kubenswrapper[17876]: I0313 10:41:55.488663 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:55.488915 master-0 kubenswrapper[17876]: I0313 10:41:55.488851 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" Mar 13 10:41:55.488971 master-0 kubenswrapper[17876]: I0313 10:41:55.488912 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:55.489030 master-0 kubenswrapper[17876]: I0313 10:41:55.488990 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:55.489087 master-0 kubenswrapper[17876]: I0313 10:41:55.489042 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-config\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.489233 master-0 kubenswrapper[17876]: I0313 10:41:55.489180 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:55.489286 master-0 kubenswrapper[17876]: I0313 10:41:55.489259 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:55.489371 master-0 kubenswrapper[17876]: I0313 10:41:55.489345 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: 
\"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:41:55.489477 master-0 kubenswrapper[17876]: I0313 10:41:55.489449 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:55.489522 master-0 kubenswrapper[17876]: I0313 10:41:55.489490 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/b7090328-1191-4c7c-afed-603d7333014f-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:55.489557 master-0 kubenswrapper[17876]: I0313 10:41:55.489523 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:55.489592 master-0 kubenswrapper[17876]: I0313 10:41:55.489558 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:55.489592 master-0 
kubenswrapper[17876]: I0313 10:41:55.489581 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:55.489653 master-0 kubenswrapper[17876]: I0313 10:41:55.489607 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.489653 master-0 kubenswrapper[17876]: I0313 10:41:55.489638 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" Mar 13 10:41:55.489714 master-0 kubenswrapper[17876]: I0313 10:41:55.489675 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:55.489801 master-0 kubenswrapper[17876]: I0313 10:41:55.489782 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-webhook-cert\") pod 
\"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:55.489867 master-0 kubenswrapper[17876]: I0313 10:41:55.489842 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:55.489901 master-0 kubenswrapper[17876]: I0313 10:41:55.489846 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.489932 master-0 kubenswrapper[17876]: I0313 10:41:55.489902 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.489975 master-0 kubenswrapper[17876]: I0313 10:41:55.489955 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:55.490027 master-0 kubenswrapper[17876]: I0313 10:41:55.489991 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:55.490027 master-0 kubenswrapper[17876]: I0313 10:41:55.489913 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: \"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" Mar 13 10:41:55.490140 master-0 kubenswrapper[17876]: I0313 10:41:55.490117 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-config\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:55.490140 master-0 kubenswrapper[17876]: I0313 10:41:55.490130 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/0881de70-2db3-4fc2-b976-b55c11dc239d-images\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.490211 master-0 kubenswrapper[17876]: I0313 10:41:55.490131 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:55.490211 
master-0 kubenswrapper[17876]: I0313 10:41:55.490190 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:55.490306 master-0 kubenswrapper[17876]: I0313 10:41:55.490288 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:41:55.490339 master-0 kubenswrapper[17876]: I0313 10:41:55.490316 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:41:55.490445 master-0 kubenswrapper[17876]: I0313 10:41:55.490417 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:55.490495 master-0 kubenswrapper[17876]: I0313 10:41:55.490453 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:55.490495 master-0 kubenswrapper[17876]: I0313 10:41:55.490481 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:55.490590 master-0 kubenswrapper[17876]: I0313 10:41:55.490570 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/b02805e2-f186-4e59-bdfa-f4793263b468-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:41:55.490590 master-0 kubenswrapper[17876]: I0313 10:41:55.490573 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:55.490719 master-0 kubenswrapper[17876]: I0313 10:41:55.490650 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:55.490766 master-0 kubenswrapper[17876]: I0313 10:41:55.490712 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:55.490766 master-0 kubenswrapper[17876]: I0313 10:41:55.490746 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:55.490825 master-0 kubenswrapper[17876]: I0313 10:41:55.490792 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:55.490825 master-0 kubenswrapper[17876]: I0313 10:41:55.490813 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/3e15f776-d153-4289-91c7-893584104185-metrics-tls\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m" Mar 13 10:41:55.490889 master-0 kubenswrapper[17876]: I0313 10:41:55.490820 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert\") pod 
\"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:55.490889 master-0 kubenswrapper[17876]: I0313 10:41:55.490874 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:55.490947 master-0 kubenswrapper[17876]: I0313 10:41:55.490933 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.490993 master-0 kubenswrapper[17876]: I0313 10:41:55.490973 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:41:55.491036 master-0 kubenswrapper[17876]: I0313 10:41:55.491018 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:55.491073 master-0 kubenswrapper[17876]: I0313 10:41:55.491044 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:55.491159 master-0 kubenswrapper[17876]: I0313 10:41:55.491087 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/97328e01-1227-417e-9af7-6426495d96db-apiservice-cert\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6" Mar 13 10:41:55.491197 master-0 kubenswrapper[17876]: I0313 10:41:55.491140 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:55.491229 master-0 kubenswrapper[17876]: I0313 10:41:55.491208 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:55.491229 master-0 kubenswrapper[17876]: I0313 10:41:55.491216 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/018c9219-d314-4408-ac39-93475d87eefb-trusted-ca-bundle\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:55.491371 master-0 kubenswrapper[17876]: I0313 10:41:55.491346 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a0917212-59d8-4799-a9bc-52e358c5e8a0-images\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:55.491405 master-0 kubenswrapper[17876]: I0313 10:41:55.491381 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:55.491503 master-0 kubenswrapper[17876]: I0313 10:41:55.491480 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0917212-59d8-4799-a9bc-52e358c5e8a0-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:55.491542 master-0 kubenswrapper[17876]: I0313 10:41:55.491486 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:55.491809 master-0 kubenswrapper[17876]: I0313 10:41:55.491774 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b02805e2-f186-4e59-bdfa-f4793263b468-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:41:55.491851 master-0 kubenswrapper[17876]: I0313 10:41:55.491804 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/018c9219-d314-4408-ac39-93475d87eefb-encryption-config\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:55.491966 master-0 kubenswrapper[17876]: I0313 10:41:55.491917 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/0881de70-2db3-4fc2-b976-b55c11dc239d-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:55.491966 master-0 kubenswrapper[17876]: I0313 10:41:55.491932 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:55.492189 master-0 kubenswrapper[17876]: I0313 10:41:55.492149 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:55.492248 master-0 kubenswrapper[17876]: I0313 10:41:55.492212 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:55.492305 master-0 kubenswrapper[17876]: I0313 10:41:55.492284 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:55.503789 master-0 kubenswrapper[17876]: I0313 10:41:55.503728 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 10:41:55.513547 master-0 kubenswrapper[17876]: I0313 10:41:55.513490 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b7090328-1191-4c7c-afed-603d7333014f-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:55.523824 
master-0 kubenswrapper[17876]: I0313 10:41:55.523767 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 10:41:55.529951 master-0 kubenswrapper[17876]: I0313 10:41:55.529895 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-webhook-certs\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" Mar 13 10:41:55.542155 master-0 kubenswrapper[17876]: I0313 10:41:55.542069 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-tvfvf" Mar 13 10:41:55.562737 master-0 kubenswrapper[17876]: I0313 10:41:55.562664 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-24kvc" Mar 13 10:41:55.582382 master-0 kubenswrapper[17876]: I0313 10:41:55.582325 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 10:41:55.593830 master-0 kubenswrapper[17876]: I0313 10:41:55.593688 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594027 master-0 kubenswrapper[17876]: I0313 10:41:55.593976 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594141 master-0 kubenswrapper[17876]: I0313 10:41:55.594047 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594273 master-0 kubenswrapper[17876]: I0313 10:41:55.594238 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594350 master-0 kubenswrapper[17876]: I0313 10:41:55.594341 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594438 master-0 kubenswrapper[17876]: I0313 10:41:55.594416 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " 
pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594530 master-0 kubenswrapper[17876]: I0313 10:41:55.594508 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594599 master-0 kubenswrapper[17876]: I0313 10:41:55.594534 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594599 master-0 kubenswrapper[17876]: I0313 10:41:55.594575 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:55.594739 master-0 kubenswrapper[17876]: I0313 10:41:55.594645 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:55.594739 master-0 kubenswrapper[17876]: I0313 10:41:55.594671 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.594988 master-0 kubenswrapper[17876]: I0313 10:41:55.594942 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:55.595063 master-0 kubenswrapper[17876]: I0313 10:41:55.595040 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj" Mar 13 10:41:55.595329 master-0 kubenswrapper[17876]: I0313 10:41:55.595289 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:55.595407 master-0 kubenswrapper[17876]: I0313 10:41:55.595341 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.595480 master-0 
kubenswrapper[17876]: I0313 10:41:55.595436 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert\") pod \"monitoring-plugin-6558455fc8-8qww9\" (UID: \"5c65aadf-c6fc-4959-9366-3e9d378bb507\") " pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" Mar 13 10:41:55.595546 master-0 kubenswrapper[17876]: I0313 10:41:55.595489 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:55.595613 master-0 kubenswrapper[17876]: I0313 10:41:55.595544 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:55.603428 master-0 kubenswrapper[17876]: I0313 10:41:55.603375 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 13 10:41:55.614764 master-0 kubenswrapper[17876]: I0313 10:41:55.614444 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0529b217-a9ef-48fb-b40a-b6789c640c20-proxy-tls\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:55.623117 master-0 kubenswrapper[17876]: I0313 10:41:55.623021 17876 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 10:41:55.630032 master-0 kubenswrapper[17876]: I0313 10:41:55.629978 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:55.631694 master-0 kubenswrapper[17876]: I0313 10:41:55.631625 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:55.633641 master-0 kubenswrapper[17876]: I0313 10:41:55.633609 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0529b217-a9ef-48fb-b40a-b6789c640c20-mcd-auth-proxy-config\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr" Mar 13 10:41:55.641838 master-0 kubenswrapper[17876]: I0313 10:41:55.641764 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x4n7x" Mar 13 10:41:55.663327 master-0 kubenswrapper[17876]: I0313 10:41:55.663248 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 10:41:55.682400 master-0 kubenswrapper[17876]: I0313 10:41:55.682285 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-insights"/"kube-root-ca.crt" Mar 13 10:41:55.703613 master-0 kubenswrapper[17876]: I0313 10:41:55.703362 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 13 10:41:55.711232 master-0 kubenswrapper[17876]: I0313 10:41:55.711151 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-service-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:55.722560 master-0 kubenswrapper[17876]: I0313 10:41:55.722505 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-h7hlp" Mar 13 10:41:55.743262 master-0 kubenswrapper[17876]: I0313 10:41:55.743061 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 10:41:55.752624 master-0 kubenswrapper[17876]: I0313 10:41:55.752544 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-serving-cert\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:55.764373 master-0 kubenswrapper[17876]: I0313 10:41:55.764289 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 13 10:41:55.785181 master-0 kubenswrapper[17876]: I0313 10:41:55.785060 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 10:41:55.791171 master-0 kubenswrapper[17876]: I0313 10:41:55.791078 17876 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/61427254-6722-4d1a-a96a-dadd24abbe94-images\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:55.807525 master-0 kubenswrapper[17876]: I0313 10:41:55.807465 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 13 10:41:55.810769 master-0 kubenswrapper[17876]: I0313 10:41:55.810716 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b" Mar 13 10:41:55.822562 master-0 kubenswrapper[17876]: I0313 10:41:55.822490 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 13 10:41:55.831682 master-0 kubenswrapper[17876]: I0313 10:41:55.831620 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/61427254-6722-4d1a-a96a-dadd24abbe94-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:55.841330 master-0 kubenswrapper[17876]: I0313 10:41:55.841267 17876 request.go:700] Waited for 2.010266168s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-operator-dockercfg-t57pn&limit=500&resourceVersion=0 Mar 13 10:41:55.843087 master-0 kubenswrapper[17876]: I0313 10:41:55.843040 17876 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-t57pn" Mar 13 10:41:55.863322 master-0 kubenswrapper[17876]: I0313 10:41:55.863194 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 10:41:55.869129 master-0 kubenswrapper[17876]: I0313 10:41:55.869071 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:55.883191 master-0 kubenswrapper[17876]: I0313 10:41:55.883137 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-jpsrw" Mar 13 10:41:55.903869 master-0 kubenswrapper[17876]: I0313 10:41:55.903805 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 13 10:41:55.911881 master-0 kubenswrapper[17876]: I0313 10:41:55.911835 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:41:55.922380 master-0 kubenswrapper[17876]: I0313 10:41:55.922318 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 10:41:55.930183 master-0 kubenswrapper[17876]: I0313 10:41:55.930135 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-cert\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" Mar 13 10:41:55.942402 master-0 kubenswrapper[17876]: I0313 10:41:55.942339 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-ff7d6" Mar 13 10:41:55.963076 master-0 kubenswrapper[17876]: I0313 10:41:55.963032 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 13 10:41:55.969949 master-0 kubenswrapper[17876]: I0313 10:41:55.969924 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:55.983008 master-0 kubenswrapper[17876]: I0313 10:41:55.982966 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 10:41:55.989666 master-0 kubenswrapper[17876]: I0313 10:41:55.989631 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:56.002351 master-0 kubenswrapper[17876]: I0313 10:41:56.002289 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-fsw7z" Mar 13 10:41:56.024806 master-0 kubenswrapper[17876]: I0313 10:41:56.023625 17876 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 10:41:56.043452 master-0 kubenswrapper[17876]: I0313 10:41:56.043397 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 10:41:56.068744 master-0 kubenswrapper[17876]: I0313 10:41:56.068682 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 13 10:41:56.069941 master-0 kubenswrapper[17876]: I0313 10:41:56.069901 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:56.082347 master-0 kubenswrapper[17876]: I0313 10:41:56.082295 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 10:41:56.090487 master-0 kubenswrapper[17876]: I0313 10:41:56.090436 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:41:56.103292 master-0 kubenswrapper[17876]: I0313 10:41:56.103254 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 10:41:56.110135 master-0 kubenswrapper[17876]: I0313 10:41:56.110079 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:56.123462 master-0 kubenswrapper[17876]: I0313 10:41:56.123404 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 10:41:56.132413 master-0 kubenswrapper[17876]: I0313 10:41:56.132356 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:56.142314 master-0 kubenswrapper[17876]: I0313 10:41:56.142225 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 13 10:41:56.151422 master-0 kubenswrapper[17876]: I0313 10:41:56.151370 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-config\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:56.162545 master-0 kubenswrapper[17876]: I0313 10:41:56.162476 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 10:41:56.183154 master-0 kubenswrapper[17876]: I0313 10:41:56.183078 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 10:41:56.203512 master-0 kubenswrapper[17876]: I0313 
10:41:56.203442 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-clxlg" Mar 13 10:41:56.223463 master-0 kubenswrapper[17876]: I0313 10:41:56.223406 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 10:41:56.231382 master-0 kubenswrapper[17876]: I0313 10:41:56.231339 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-certs\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:56.242814 master-0 kubenswrapper[17876]: I0313 10:41:56.242751 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 13 10:41:56.250523 master-0 kubenswrapper[17876]: I0313 10:41:56.250415 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/161beda5-f575-4e60-8baa-5262a4fe86c7-node-bootstrap-token\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs" Mar 13 10:41:56.262304 master-0 kubenswrapper[17876]: I0313 10:41:56.262234 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-c2nqj" Mar 13 10:41:56.282997 master-0 kubenswrapper[17876]: I0313 10:41:56.282936 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 10:41:56.291230 master-0 kubenswrapper[17876]: I0313 10:41:56.291168 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:56.302858 master-0 kubenswrapper[17876]: I0313 10:41:56.302782 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 13 10:41:56.306285 master-0 kubenswrapper[17876]: I0313 10:41:56.306229 17876 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 13 10:41:56.311156 master-0 kubenswrapper[17876]: I0313 10:41:56.309490 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:56.323463 master-0 kubenswrapper[17876]: I0313 10:41:56.323399 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 10:41:56.332494 master-0 kubenswrapper[17876]: I0313 10:41:56.332456 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:56.342969 master-0 kubenswrapper[17876]: 
I0313 10:41:56.342916 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 10:41:56.350813 master-0 kubenswrapper[17876]: I0313 10:41:56.350765 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7c5279e3-0165-4347-bfc7-87b80accaab3-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" Mar 13 10:41:56.352992 master-0 kubenswrapper[17876]: I0313 10:41:56.352951 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7748068f-7409-4972-81d2-84cfb52b7af0-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:56.362414 master-0 kubenswrapper[17876]: I0313 10:41:56.362355 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2p4lb" Mar 13 10:41:56.383366 master-0 kubenswrapper[17876]: I0313 10:41:56.383248 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 10:41:56.392393 master-0 kubenswrapper[17876]: I0313 10:41:56.392341 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:56.402797 master-0 kubenswrapper[17876]: I0313 10:41:56.402725 17876 reflector.go:368] Caches populated 
for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 13 10:41:56.415120 master-0 kubenswrapper[17876]: I0313 10:41:56.410425 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/7748068f-7409-4972-81d2-84cfb52b7af0-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc" Mar 13 10:41:56.423706 master-0 kubenswrapper[17876]: I0313 10:41:56.423655 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 10:41:56.442258 master-0 kubenswrapper[17876]: I0313 10:41:56.442187 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 10:41:56.447226 master-0 kubenswrapper[17876]: I0313 10:41:56.447179 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:56.464048 master-0 kubenswrapper[17876]: I0313 10:41:56.463999 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 10:41:56.470619 master-0 kubenswrapper[17876]: I0313 10:41:56.470583 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " 
pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:56.482465 master-0 kubenswrapper[17876]: I0313 10:41:56.482401 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 10:41:56.487834 master-0 kubenswrapper[17876]: I0313 10:41:56.487791 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:56.511356 master-0 kubenswrapper[17876]: I0313 10:41:56.511301 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 10:41:56.521204 master-0 kubenswrapper[17876]: I0313 10:41:56.521058 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:56.523791 master-0 kubenswrapper[17876]: I0313 10:41:56.523759 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 10:41:56.525152 master-0 kubenswrapper[17876]: I0313 10:41:56.525122 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" 
Mar 13 10:41:56.542326 master-0 kubenswrapper[17876]: I0313 10:41:56.542260 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 10:41:56.547573 master-0 kubenswrapper[17876]: I0313 10:41:56.547524 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:56.562769 master-0 kubenswrapper[17876]: I0313 10:41:56.562721 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 10:41:56.566234 master-0 kubenswrapper[17876]: I0313 10:41:56.566207 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:56.582456 master-0 kubenswrapper[17876]: I0313 10:41:56.582420 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 10:41:56.588663 master-0 kubenswrapper[17876]: I0313 10:41:56.588610 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" 
Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595250 17876 secret.go:189] Couldn't get secret openshift-console-operator/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595316 17876 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595335 17876 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595376 17876 configmap.go:193] Couldn't get configMap openshift-image-registry/image-registry-certificates: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595411 17876 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-user-template-error: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595415 17876 configmap.go:193] Couldn't get configMap openshift-console-operator/trusted-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595343 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert podName:1109b282-3ee4-4c4e-a64a-e6a22adeb6c9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595319521 +0000 UTC m=+25.431125997 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert") pod "console-operator-6c7fb6b958-rb7nv" (UID: "1109b282-3ee4-4c4e-a64a-e6a22adeb6c9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595457 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert podName:c179be5b-2517-4ae5-9c30-2d4415899123 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595443124 +0000 UTC m=+25.431249600 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert") pod "ingress-canary-p5ncj" (UID: "c179be5b-2517-4ae5-9c30-2d4415899123") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595503 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595467035 +0000 UTC m=+25.431273511 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-system-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595523 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca podName:d23e2957-3a22-44f6-937c-5ab6314681c0 nodeName:}" failed. 
No retries permitted until 2026-03-13 10:41:57.595514886 +0000 UTC m=+25.431321362 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serviceca" (UniqueName: "kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca") pod "node-ca-trztz" (UID: "d23e2957-3a22-44f6-937c-5ab6314681c0") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595537 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595530537 +0000 UTC m=+25.431337013 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "v4-0-config-user-template-error" (UniqueName: "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595572 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca podName:1109b282-3ee4-4c4e-a64a-e6a22adeb6c9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595544057 +0000 UTC m=+25.431350523 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca") pod "console-operator-6c7fb6b958-rb7nv" (UID: "1109b282-3ee4-4c4e-a64a-e6a22adeb6c9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595596 17876 secret.go:189] Couldn't get secret openshift-monitoring/monitoring-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595627 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert podName:5c65aadf-c6fc-4959-9366-3e9d378bb507 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595618599 +0000 UTC m=+25.431425075 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "monitoring-plugin-cert" (UniqueName: "kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert") pod "monitoring-plugin-6558455fc8-8qww9" (UID: "5c65aadf-c6fc-4959-9366-3e9d378bb507") : failed to sync secret cache: timed out waiting for the condition Mar 13 10:41:56.595781 master-0 kubenswrapper[17876]: E0313 10:41:56.595693 17876 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-service-ca: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.596421 master-0 kubenswrapper[17876]: E0313 10:41:56.595989 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca podName:f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.595951929 +0000 UTC m=+25.431758405 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-service-ca" (UniqueName: "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca") pod "oauth-openshift-5db65d9766-lg686" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.596421 master-0 kubenswrapper[17876]: E0313 10:41:56.595732 17876 configmap.go:193] Couldn't get configMap openshift-console-operator/console-operator-config: failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.596421 master-0 kubenswrapper[17876]: E0313 10:41:56.596032 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config podName:1109b282-3ee4-4c4e-a64a-e6a22adeb6c9 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:57.596022671 +0000 UTC m=+25.431829147 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config") pod "console-operator-6c7fb6b958-rb7nv" (UID: "1109b282-3ee4-4c4e-a64a-e6a22adeb6c9") : failed to sync configmap cache: timed out waiting for the condition Mar 13 10:41:56.602218 master-0 kubenswrapper[17876]: I0313 10:41:56.602159 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 10:41:56.622871 master-0 kubenswrapper[17876]: I0313 10:41:56.622804 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 10:41:56.642995 master-0 kubenswrapper[17876]: I0313 10:41:56.642885 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-kkkpw" Mar 13 10:41:56.662647 master-0 kubenswrapper[17876]: I0313 10:41:56.662590 17876 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 10:41:56.693355 master-0 kubenswrapper[17876]: I0313 10:41:56.693280 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 10:41:56.703640 master-0 kubenswrapper[17876]: I0313 10:41:56.703404 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 10:41:56.722795 master-0 kubenswrapper[17876]: I0313 10:41:56.722721 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 10:41:56.742777 master-0 kubenswrapper[17876]: I0313 10:41:56.742721 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 10:41:56.762978 master-0 kubenswrapper[17876]: I0313 10:41:56.762908 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 10:41:56.790673 master-0 kubenswrapper[17876]: I0313 10:41:56.790616 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 10:41:56.802603 master-0 kubenswrapper[17876]: I0313 10:41:56.802546 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 10:41:56.822394 master-0 kubenswrapper[17876]: I0313 10:41:56.822330 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-hggr7" Mar 13 10:41:56.843122 master-0 kubenswrapper[17876]: I0313 10:41:56.843057 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 10:41:56.861306 master-0 kubenswrapper[17876]: I0313 10:41:56.861249 17876 request.go:700] Waited for 3.02040177s due to client-side throttling, 
not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 13 10:41:56.863163 master-0 kubenswrapper[17876]: I0313 10:41:56.863086 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 10:41:56.902728 master-0 kubenswrapper[17876]: I0313 10:41:56.902590 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-5258b" Mar 13 10:41:56.922527 master-0 kubenswrapper[17876]: I0313 10:41:56.922472 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 10:41:56.943461 master-0 kubenswrapper[17876]: I0313 10:41:56.943393 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 10:41:56.963465 master-0 kubenswrapper[17876]: I0313 10:41:56.963402 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-894vf" Mar 13 10:41:56.999283 master-0 kubenswrapper[17876]: I0313 10:41:56.999229 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p29b\" (UniqueName: \"kubernetes.io/projected/06ecac2e-bffa-474b-a824-9ba4a194159a-kube-api-access-6p29b\") pod \"control-plane-machine-set-operator-6686554ddc-d5flg\" (UID: \"06ecac2e-bffa-474b-a824-9ba4a194159a\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" Mar 13 10:41:57.020372 master-0 kubenswrapper[17876]: I0313 10:41:57.020303 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fswp7\" (UniqueName: \"kubernetes.io/projected/257ae542-4a06-42d3-b3e8-bf0a376494a8-kube-api-access-fswp7\") pod \"certified-operators-kwwkz\" (UID: 
\"257ae542-4a06-42d3-b3e8-bf0a376494a8\") " pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:41:57.040578 master-0 kubenswrapper[17876]: I0313 10:41:57.040522 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlmhs\" (UniqueName: \"kubernetes.io/projected/6e69683c-59c5-43da-b105-ef2efb2d0a4e-kube-api-access-wlmhs\") pod \"service-ca-operator-69b6fc6b88-xbqsd\" (UID: \"6e69683c-59c5-43da-b105-ef2efb2d0a4e\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-xbqsd" Mar 13 10:41:57.056251 master-0 kubenswrapper[17876]: I0313 10:41:57.056185 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzxzq\" (UniqueName: \"kubernetes.io/projected/a13f3e08-2b67-404f-8695-77aa17f92137-kube-api-access-bzxzq\") pod \"package-server-manager-854648ff6d-cfp26\" (UID: \"a13f3e08-2b67-404f-8695-77aa17f92137\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26" Mar 13 10:41:57.080602 master-0 kubenswrapper[17876]: I0313 10:41:57.080546 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vbsc\" (UniqueName: \"kubernetes.io/projected/61427254-6722-4d1a-a96a-dadd24abbe94-kube-api-access-6vbsc\") pod \"machine-config-operator-fdb5c78b5-c65k4\" (UID: \"61427254-6722-4d1a-a96a-dadd24abbe94\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" Mar 13 10:41:57.097181 master-0 kubenswrapper[17876]: I0313 10:41:57.097126 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jk4qr\" (UniqueName: \"kubernetes.io/projected/ec33c506-8abe-4659-84d3-a294c31b446c-kube-api-access-jk4qr\") pod \"operator-controller-controller-manager-6598bfb6c4-22jb5\" (UID: \"ec33c506-8abe-4659-84d3-a294c31b446c\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:41:57.117952 master-0 kubenswrapper[17876]: I0313 
10:41:57.117880 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mnnnp\" (UniqueName: \"kubernetes.io/projected/b57f1c19-f44a-4405-8135-79aef1d1ce07-kube-api-access-mnnnp\") pod \"cluster-storage-operator-6fbfc8dc8f-wz9t2\" (UID: \"b57f1c19-f44a-4405-8135-79aef1d1ce07\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" Mar 13 10:41:57.141149 master-0 kubenswrapper[17876]: I0313 10:41:57.140030 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5gkd\" (UniqueName: \"kubernetes.io/projected/a3a72b45-a705-4335-9c04-c952ec5d9975-kube-api-access-b5gkd\") pod \"tuned-mzx9f\" (UID: \"a3a72b45-a705-4335-9c04-c952ec5d9975\") " pod="openshift-cluster-node-tuning-operator/tuned-mzx9f" Mar 13 10:41:57.156560 master-0 kubenswrapper[17876]: I0313 10:41:57.156167 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmf6l\" (UniqueName: \"kubernetes.io/projected/8dc7af5f-ff72-4f06-88df-a26ff4c0bded-kube-api-access-vmf6l\") pod \"machine-approver-754bdc9f9d-942bv\" (UID: \"8dc7af5f-ff72-4f06-88df-a26ff4c0bded\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" Mar 13 10:41:57.179757 master-0 kubenswrapper[17876]: I0313 10:41:57.179691 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kqq\" (UniqueName: \"kubernetes.io/projected/d9fd7b06-d61d-47c3-a08f-846245c79cc9-kube-api-access-s2kqq\") pod \"cluster-node-tuning-operator-66c7586884-2qml7\" (UID: \"d9fd7b06-d61d-47c3-a08f-846245c79cc9\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-2qml7" Mar 13 10:41:57.194464 master-0 kubenswrapper[17876]: I0313 10:41:57.194406 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f6fm9\" (UniqueName: \"kubernetes.io/projected/8d2fdba3-9478-4165-9207-d01483625607-kube-api-access-f6fm9\") 
pod \"network-operator-7c649bf6d4-z9wrg\" (UID: \"8d2fdba3-9478-4165-9207-d01483625607\") " pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" Mar 13 10:41:57.216063 master-0 kubenswrapper[17876]: I0313 10:41:57.216011 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8z5fj\" (UniqueName: \"kubernetes.io/projected/94f7921a-6d0f-45b7-ba8f-9f2ef74b044e-kube-api-access-8z5fj\") pod \"router-default-79f8cd6fdd-mbkch\" (UID: \"94f7921a-6d0f-45b7-ba8f-9f2ef74b044e\") " pod="openshift-ingress/router-default-79f8cd6fdd-mbkch" Mar 13 10:41:57.238074 master-0 kubenswrapper[17876]: I0313 10:41:57.238014 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4kn26\" (UniqueName: \"kubernetes.io/projected/8b07c5ae-1149-4031-bd92-6df4331e586c-kube-api-access-4kn26\") pod \"community-operators-lhqzl\" (UID: \"8b07c5ae-1149-4031-bd92-6df4331e586c\") " pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:41:57.254477 master-0 kubenswrapper[17876]: I0313 10:41:57.254418 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn5nv\" (UniqueName: \"kubernetes.io/projected/58685de6-b4ae-4229-870b-5143a6010450-kube-api-access-kn5nv\") pod \"iptables-alerter-55t7x\" (UID: \"58685de6-b4ae-4229-870b-5143a6010450\") " pod="openshift-network-operator/iptables-alerter-55t7x" Mar 13 10:41:57.273891 master-0 kubenswrapper[17876]: I0313 10:41:57.273823 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-bound-sa-token\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:41:57.296200 master-0 kubenswrapper[17876]: I0313 10:41:57.296133 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-tzdf2\" (UniqueName: \"kubernetes.io/projected/f8c7f667-d30e-41f4-8c0e-f3f138bffab4-kube-api-access-tzdf2\") pod \"cluster-olm-operator-77899cf6d-xvxcr\" (UID: \"f8c7f667-d30e-41f4-8c0e-f3f138bffab4\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" Mar 13 10:41:57.314989 master-0 kubenswrapper[17876]: I0313 10:41:57.314921 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-892f7\" (UniqueName: \"kubernetes.io/projected/03b97fde-467c-46f0-95f9-9c3820b4d790-kube-api-access-892f7\") pod \"catalog-operator-7d9c49f57b-tw9nm\" (UID: \"03b97fde-467c-46f0-95f9-9c3820b4d790\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm" Mar 13 10:41:57.338475 master-0 kubenswrapper[17876]: I0313 10:41:57.338406 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6nfl8\" (UniqueName: \"kubernetes.io/projected/cf740515-d70d-44b6-ac00-21143b5494d1-kube-api-access-6nfl8\") pod \"ingress-operator-677db989d6-b2ss8\" (UID: \"cf740515-d70d-44b6-ac00-21143b5494d1\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-b2ss8" Mar 13 10:41:57.357709 master-0 kubenswrapper[17876]: I0313 10:41:57.357643 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqf9z\" (UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-kube-api-access-hqf9z\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:57.373462 master-0 kubenswrapper[17876]: I0313 10:41:57.373399 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d72bw\" (UniqueName: \"kubernetes.io/projected/3f872e59-1de1-4a95-8064-79696c73e8ab-kube-api-access-d72bw\") pod \"openshift-config-operator-64488f9d78-pchtd\" (UID: 
\"3f872e59-1de1-4a95-8064-79696c73e8ab\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:41:57.393424 master-0 kubenswrapper[17876]: I0313 10:41:57.393371 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6lnq\" (UniqueName: \"kubernetes.io/projected/018c9219-d314-4408-ac39-93475d87eefb-kube-api-access-v6lnq\") pod \"apiserver-576d4447f8-zqphk\" (UID: \"018c9219-d314-4408-ac39-93475d87eefb\") " pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:41:57.416832 master-0 kubenswrapper[17876]: I0313 10:41:57.416624 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9c92k\" (UniqueName: \"kubernetes.io/projected/ecb5bdcc-647d-4292-a33d-dc3df331c206-kube-api-access-9c92k\") pod \"authentication-operator-7c6989d6c4-8kd6c\" (UID: \"ecb5bdcc-647d-4292-a33d-dc3df331c206\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" Mar 13 10:41:57.435244 master-0 kubenswrapper[17876]: I0313 10:41:57.435195 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gntlk\" (UniqueName: \"kubernetes.io/projected/2157cb66-d458-4353-bc9c-ef761e61e5c5-kube-api-access-gntlk\") pod \"redhat-operators-kqrsd\" (UID: \"2157cb66-d458-4353-bc9c-ef761e61e5c5\") " pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:41:57.453475 master-0 kubenswrapper[17876]: I0313 10:41:57.453409 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zltcf\" (UniqueName: \"kubernetes.io/projected/a7b698d2-f23a-4404-bc63-757ca549356f-kube-api-access-zltcf\") pod \"network-check-target-jwfjl\" (UID: \"a7b698d2-f23a-4404-bc63-757ca549356f\") " pod="openshift-network-diagnostics/network-check-target-jwfjl" Mar 13 10:41:57.475852 master-0 kubenswrapper[17876]: I0313 10:41:57.475795 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/25332da9-099c-4190-9e24-c19c86830a54-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-cchhs\" (UID: \"25332da9-099c-4190-9e24-c19c86830a54\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-cchhs" Mar 13 10:41:57.494861 master-0 kubenswrapper[17876]: I0313 10:41:57.494796 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wrq5t\" (UniqueName: \"kubernetes.io/projected/e87ca16c-25de-4fea-b900-2960f4a5f95e-kube-api-access-wrq5t\") pod \"csi-snapshot-controller-operator-5685fbc7d-pn89z\" (UID: \"e87ca16c-25de-4fea-b900-2960f4a5f95e\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" Mar 13 10:41:57.519127 master-0 kubenswrapper[17876]: I0313 10:41:57.519031 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mjm7\" (UniqueName: \"kubernetes.io/projected/fd91626c-38a8-462f-8bc0-96d57532de87-kube-api-access-7mjm7\") pod \"migrator-57ccdf9b5-k9n8l\" (UID: \"fd91626c-38a8-462f-8bc0-96d57532de87\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-k9n8l" Mar 13 10:41:57.534538 master-0 kubenswrapper[17876]: I0313 10:41:57.534473 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvl4j\" (UniqueName: \"kubernetes.io/projected/b02805e2-f186-4e59-bdfa-f4793263b468-kube-api-access-cvl4j\") pod \"cloud-credential-operator-55d85b7b47-qbgcg\" (UID: \"b02805e2-f186-4e59-bdfa-f4793263b468\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-qbgcg" Mar 13 10:41:57.554921 master-0 kubenswrapper[17876]: I0313 10:41:57.554857 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fcqg\" (UniqueName: \"kubernetes.io/projected/2563ecb2-5783-4c45-a7f6-180e14e1c8c4-kube-api-access-4fcqg\") pod \"cluster-samples-operator-664cb58b85-82x6j\" (UID: 
\"2563ecb2-5783-4c45-a7f6-180e14e1c8c4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-82x6j" Mar 13 10:41:57.574284 master-0 kubenswrapper[17876]: I0313 10:41:57.574201 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffs2h\" (UniqueName: \"kubernetes.io/projected/024d9bd3-ac77-4257-9808-7518f2a73e11-kube-api-access-ffs2h\") pod \"olm-operator-d64cfc9db-h46sf\" (UID: \"024d9bd3-ac77-4257-9808-7518f2a73e11\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf" Mar 13 10:41:57.595145 master-0 kubenswrapper[17876]: I0313 10:41:57.595065 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr9x5\" (UniqueName: \"kubernetes.io/projected/b460735c-56aa-4dd3-a756-759859083e12-kube-api-access-qr9x5\") pod \"network-check-source-7c67b67d47-zxjfv\" (UID: \"b460735c-56aa-4dd3-a756-759859083e12\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-zxjfv" Mar 13 10:41:57.618987 master-0 kubenswrapper[17876]: I0313 10:41:57.618914 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") pod \"route-controller-manager-9f8f9b5c9-pjljl\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:41:57.633902 master-0 kubenswrapper[17876]: I0313 10:41:57.633842 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:57.634155 master-0 kubenswrapper[17876]: I0313 10:41:57.633912 
17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert\") pod \"monitoring-plugin-6558455fc8-8qww9\" (UID: \"5c65aadf-c6fc-4959-9366-3e9d378bb507\") " pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" Mar 13 10:41:57.634232 master-0 kubenswrapper[17876]: I0313 10:41:57.634156 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.634695 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.634749 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.634765 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") 
pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.634970 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.635045 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.635262 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.635299 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.635893 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/d23e2957-3a22-44f6-937c-5ab6314681c0-serviceca\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.636233 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-config\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:57.636289 master-0 kubenswrapper[17876]: I0313 10:41:57.636255 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-trusted-ca\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:57.636676 master-0 kubenswrapper[17876]: I0313 10:41:57.636444 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:57.637426 master-0 kubenswrapper[17876]: I0313 10:41:57.637393 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5c65aadf-c6fc-4959-9366-3e9d378bb507-monitoring-plugin-cert\") pod \"monitoring-plugin-6558455fc8-8qww9\" (UID: \"5c65aadf-c6fc-4959-9366-3e9d378bb507\") " pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" Mar 13 10:41:57.638731 master-0 kubenswrapper[17876]: I0313 10:41:57.638698 17876 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:41:57.639035 master-0 kubenswrapper[17876]: I0313 10:41:57.638996 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c179be5b-2517-4ae5-9c30-2d4415899123-cert\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj" Mar 13 10:41:57.639386 master-0 kubenswrapper[17876]: I0313 10:41:57.639200 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-serving-cert\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:41:57.640216 master-0 kubenswrapper[17876]: I0313 10:41:57.640182 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9l88\" (UniqueName: \"kubernetes.io/projected/a3c91eef-ec46-419f-b418-ac3a8094b77d-kube-api-access-b9l88\") pod \"network-node-identity-hkjrg\" (UID: \"a3c91eef-ec46-419f-b418-ac3a8094b77d\") " pod="openshift-network-node-identity/network-node-identity-hkjrg" Mar 13 10:41:57.658490 master-0 kubenswrapper[17876]: I0313 10:41:57.658447 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscfc\" (UniqueName: \"kubernetes.io/projected/fb060653-0d4b-4759-a7a1-c5dce194cce7-kube-api-access-zscfc\") pod \"ovnkube-node-vww4t\" (UID: \"fb060653-0d4b-4759-a7a1-c5dce194cce7\") " pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 
10:41:57.683645 master-0 kubenswrapper[17876]: I0313 10:41:57.683503 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjxp2\" (UniqueName: \"kubernetes.io/projected/e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc-kube-api-access-zjxp2\") pod \"multus-bjv5r\" (UID: \"e970692c-7d5b-4ab1-bc7e-2c2e98f3f6dc\") " pod="openshift-multus/multus-bjv5r" Mar 13 10:41:57.704955 master-0 kubenswrapper[17876]: I0313 10:41:57.704899 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/893dac15-d6d4-4a1f-988c-59aaf9e63334-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-ml9xh\" (UID: \"893dac15-d6d4-4a1f-988c-59aaf9e63334\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" Mar 13 10:41:57.716168 master-0 kubenswrapper[17876]: I0313 10:41:57.716108 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htb49\" (UniqueName: \"kubernetes.io/projected/2c3e94d4-5c6d-4092-975c-e5bca49eb397-kube-api-access-htb49\") pod \"service-ca-84bfdbbb7f-xldln\" (UID: \"2c3e94d4-5c6d-4092-975c-e5bca49eb397\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-xldln" Mar 13 10:41:57.742123 master-0 kubenswrapper[17876]: I0313 10:41:57.742061 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrmcp\" (UniqueName: \"kubernetes.io/projected/a0917212-59d8-4799-a9bc-52e358c5e8a0-kube-api-access-lrmcp\") pod \"machine-api-operator-84bf6db4f9-svqcp\" (UID: \"a0917212-59d8-4799-a9bc-52e358c5e8a0\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" Mar 13 10:41:57.753407 master-0 kubenswrapper[17876]: I0313 10:41:57.753361 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd99t\" (UniqueName: \"kubernetes.io/projected/0932314b-ccf5-4be5-99f8-b99886392daa-kube-api-access-kd99t\") pod 
\"etcd-operator-5884b9cd56-t2xfz\" (UID: \"0932314b-ccf5-4be5-99f8-b99886392daa\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-t2xfz" Mar 13 10:41:57.779264 master-0 kubenswrapper[17876]: I0313 10:41:57.779228 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjkdx\" (UniqueName: \"kubernetes.io/projected/0881de70-2db3-4fc2-b976-b55c11dc239d-kube-api-access-vjkdx\") pod \"cluster-baremetal-operator-5cdb4c5598-2c4sl\" (UID: \"0881de70-2db3-4fc2-b976-b55c11dc239d\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" Mar 13 10:41:57.794253 master-0 kubenswrapper[17876]: I0313 10:41:57.794214 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rqms\" (UniqueName: \"kubernetes.io/projected/9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e-kube-api-access-5rqms\") pod \"machine-config-controller-ff46b7bdf-zx8pp\" (UID: \"9bbcde8d-4c56-4ef7-9fe5-f0ceebb1e65e\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-zx8pp" Mar 13 10:41:57.816888 master-0 kubenswrapper[17876]: I0313 10:41:57.816849 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5p4cf\" (UniqueName: \"kubernetes.io/projected/cc66541c-6410-4824-b173-53747069429e-kube-api-access-5p4cf\") pod \"multus-additional-cni-plugins-72t2n\" (UID: \"cc66541c-6410-4824-b173-53747069429e\") " pod="openshift-multus/multus-additional-cni-plugins-72t2n" Mar 13 10:41:57.833354 master-0 kubenswrapper[17876]: I0313 10:41:57.833259 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr5lp\" (UniqueName: \"kubernetes.io/projected/9ca1b7c7-41af-46e9-8f5d-a476ee2b7587-kube-api-access-qr5lp\") pod \"multus-admission-controller-7769569c45-6lqz5\" (UID: \"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587\") " pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" Mar 13 10:41:57.854720 master-0 kubenswrapper[17876]: I0313 
10:41:57.854658 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpnm8\" (UniqueName: \"kubernetes.io/projected/1f358d81-87c6-40bf-89e8-5681429285f8-kube-api-access-rpnm8\") pod \"openshift-controller-manager-operator-8565d84698-4kpg8\" (UID: \"1f358d81-87c6-40bf-89e8-5681429285f8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" Mar 13 10:41:57.874471 master-0 kubenswrapper[17876]: I0313 10:41:57.874367 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" Mar 13 10:41:57.874471 master-0 kubenswrapper[17876]: I0313 10:41:57.874436 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9cxp\" (UniqueName: \"kubernetes.io/projected/b7090328-1191-4c7c-afed-603d7333014f-kube-api-access-v9cxp\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n\" (UID: \"b7090328-1191-4c7c-afed-603d7333014f\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" Mar 13 10:41:57.881621 master-0 kubenswrapper[17876]: I0313 10:41:57.881565 17876 request.go:700] Waited for 3.94048573s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-oauth-apiserver/serviceaccounts/oauth-apiserver-sa/token Mar 13 10:41:57.896619 master-0 kubenswrapper[17876]: I0313 10:41:57.896556 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nr6p\" (UniqueName: \"kubernetes.io/projected/a9258b0f-fdcc-4bfa-b982-5cf3c899c432-kube-api-access-4nr6p\") pod \"apiserver-999d99f5f-hlk52\" (UID: \"a9258b0f-fdcc-4bfa-b982-5cf3c899c432\") " pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:41:57.938279 master-0 kubenswrapper[17876]: I0313 10:41:57.935740 17876 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-b9768\" (UniqueName: \"kubernetes.io/projected/e7d31378-e940-4473-ab37-10f250c76666-kube-api-access-b9768\") pod \"dns-operator-589895fbb7-6zkqh\" (UID: \"e7d31378-e940-4473-ab37-10f250c76666\") " pod="openshift-dns-operator/dns-operator-589895fbb7-6zkqh"
Mar 13 10:41:57.941246 master-0 kubenswrapper[17876]: I0313 10:41:57.938797 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5v4b\" (UniqueName: \"kubernetes.io/projected/84f78350-e85c-4377-97cd-9e9a1b2ff4ee-kube-api-access-d5v4b\") pod \"csi-snapshot-controller-7577d6f48-kcw4k\" (UID: \"84f78350-e85c-4377-97cd-9e9a1b2ff4ee\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k"
Mar 13 10:41:57.961513 master-0 kubenswrapper[17876]: I0313 10:41:57.961468 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nqrh5\" (UniqueName: \"kubernetes.io/projected/db9faadf-74e9-4a7f-b3a6-902dd14ac978-kube-api-access-nqrh5\") pod \"catalogd-controller-manager-7f8b8b6f4c-657wt\" (UID: \"db9faadf-74e9-4a7f-b3a6-902dd14ac978\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:57.976565 master-0 kubenswrapper[17876]: I0313 10:41:57.976504 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b5ed7aff-47c0-42f3-9a26-9385d2bde582-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-zg9h2\" (UID: \"b5ed7aff-47c0-42f3-9a26-9385d2bde582\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-zg9h2"
Mar 13 10:41:57.995969 master-0 kubenswrapper[17876]: I0313 10:41:57.995907 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-frmjp\" (UniqueName: \"kubernetes.io/projected/2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf-kube-api-access-frmjp\") pod \"openshift-apiserver-operator-799b6db4d7-ndnmq\" (UID: \"2619d0e9-73d1-4ad3-a8b1-b9b37ecf84bf\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-ndnmq"
Mar 13 10:41:58.019600 master-0 kubenswrapper[17876]: I0313 10:41:58.019460 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/53da2840-4a92-497a-a9d3-973583887147-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-z7h4j\" (UID: \"53da2840-4a92-497a-a9d3-973583887147\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-z7h4j"
Mar 13 10:41:58.042986 master-0 kubenswrapper[17876]: I0313 10:41:58.042897 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") pod \"controller-manager-79847c4f97-tf57f\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") " pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f"
Mar 13 10:41:58.054468 master-0 kubenswrapper[17876]: I0313 10:41:58.054421 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8dpd\" (UniqueName: \"kubernetes.io/projected/9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906-kube-api-access-g8dpd\") pod \"insights-operator-8f89dfddd-v9x5b\" (UID: \"9b2be0aa-ec8f-4d4e-8b61-e028a4f9a906\") " pod="openshift-insights/insights-operator-8f89dfddd-v9x5b"
Mar 13 10:41:58.073305 master-0 kubenswrapper[17876]: I0313 10:41:58.073244 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws7gk\" (UniqueName: \"kubernetes.io/projected/7748068f-7409-4972-81d2-84cfb52b7af0-kube-api-access-ws7gk\") pod \"prometheus-operator-5ff8674d55-zpqlc\" (UID: \"7748068f-7409-4972-81d2-84cfb52b7af0\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-zpqlc"
Mar 13 10:41:58.095218 master-0 kubenswrapper[17876]: I0313 10:41:58.095145 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6smf\" (UniqueName: \"kubernetes.io/projected/161beda5-f575-4e60-8baa-5262a4fe86c7-kube-api-access-q6smf\") pod \"machine-config-server-zkmjs\" (UID: \"161beda5-f575-4e60-8baa-5262a4fe86c7\") " pod="openshift-machine-config-operator/machine-config-server-zkmjs"
Mar 13 10:41:58.115338 master-0 kubenswrapper[17876]: I0313 10:41:58.115203 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkwfv\" (UniqueName: \"kubernetes.io/projected/1ef32245-c238-43c6-a57a-a5ac95aff1f7-kube-api-access-xkwfv\") pod \"marketplace-operator-64bf9778cb-4v99n\" (UID: \"1ef32245-c238-43c6-a57a-a5ac95aff1f7\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:41:58.134525 master-0 kubenswrapper[17876]: I0313 10:41:58.134412 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"multus-admission-controller-8d675b596-6gzxr\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") " pod="openshift-multus/multus-admission-controller-8d675b596-6gzxr"
Mar 13 10:41:58.144880 master-0 kubenswrapper[17876]: I0313 10:41:58.144814 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") pod \"5da919b6-8545-4001-89f3-74cb289327f0\" (UID: \"5da919b6-8545-4001-89f3-74cb289327f0\") "
Mar 13 10:41:58.148451 master-0 kubenswrapper[17876]: I0313 10:41:58.148383 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj" (OuterVolumeSpecName: "kube-api-access-twcrj") pod "5da919b6-8545-4001-89f3-74cb289327f0" (UID: "5da919b6-8545-4001-89f3-74cb289327f0"). InnerVolumeSpecName "kube-api-access-twcrj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:41:58.155514 master-0 kubenswrapper[17876]: I0313 10:41:58.155466 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bn7vq\" (UniqueName: \"kubernetes.io/projected/f99b999c-4213-4d29-ab14-26c584e88445-kube-api-access-bn7vq\") pod \"redhat-marketplace-dnhzw\" (UID: \"f99b999c-4213-4d29-ab14-26c584e88445\") " pod="openshift-marketplace/redhat-marketplace-dnhzw"
Mar 13 10:41:58.178903 master-0 kubenswrapper[17876]: I0313 10:41:58.178856 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7cf7b1dc-96ab-41ef-871c-9ed5ce2db584-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-cvqxk\" (UID: \"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk"
Mar 13 10:41:58.193603 master-0 kubenswrapper[17876]: I0313 10:41:58.193421 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwfd8\" (UniqueName: \"kubernetes.io/projected/ba3e43ba-2840-4612-a370-87ad3c5a382a-kube-api-access-hwfd8\") pod \"kube-storage-version-migrator-operator-7f65c457f5-kxmt9\" (UID: \"ba3e43ba-2840-4612-a370-87ad3c5a382a\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9"
Mar 13 10:41:58.213642 master-0 kubenswrapper[17876]: I0313 10:41:58.213570 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74lr7\" (UniqueName: \"kubernetes.io/projected/8df2728b-4f21-4aef-b31f-4197bbcd2728-kube-api-access-74lr7\") pod \"network-metrics-daemon-c5vhc\" (UID: \"8df2728b-4f21-4aef-b31f-4197bbcd2728\") " pod="openshift-multus/network-metrics-daemon-c5vhc"
Mar 13 10:41:58.233393 master-0 kubenswrapper[17876]: I0313 10:41:58.233334 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") pod \"cni-sysctl-allowlist-ds-7z94w\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") " pod="openshift-multus/cni-sysctl-allowlist-ds-7z94w"
Mar 13 10:41:58.246758 master-0 kubenswrapper[17876]: I0313 10:41:58.246691 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") pod \"277614e8-838f-4773-bcfc-89f19c620dee\" (UID: \"277614e8-838f-4773-bcfc-89f19c620dee\") "
Mar 13 10:41:58.247312 master-0 kubenswrapper[17876]: I0313 10:41:58.247280 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-twcrj\" (UniqueName: \"kubernetes.io/projected/5da919b6-8545-4001-89f3-74cb289327f0-kube-api-access-twcrj\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:58.249823 master-0 kubenswrapper[17876]: I0313 10:41:58.249777 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz" (OuterVolumeSpecName: "kube-api-access-jzvxz") pod "277614e8-838f-4773-bcfc-89f19c620dee" (UID: "277614e8-838f-4773-bcfc-89f19c620dee"). InnerVolumeSpecName "kube-api-access-jzvxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:41:58.254784 master-0 kubenswrapper[17876]: I0313 10:41:58.254746 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m5x2b\" (UniqueName: \"kubernetes.io/projected/0529b217-a9ef-48fb-b40a-b6789c640c20-kube-api-access-m5x2b\") pod \"machine-config-daemon-j9twr\" (UID: \"0529b217-a9ef-48fb-b40a-b6789c640c20\") " pod="openshift-machine-config-operator/machine-config-daemon-j9twr"
Mar 13 10:41:58.277741 master-0 kubenswrapper[17876]: I0313 10:41:58.277449 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffmmr\" (UniqueName: \"kubernetes.io/projected/97328e01-1227-417e-9af7-6426495d96db-kube-api-access-ffmmr\") pod \"packageserver-85b658d7fb-45fq6\" (UID: \"97328e01-1227-417e-9af7-6426495d96db\") " pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"
Mar 13 10:41:58.290196 master-0 kubenswrapper[17876]: W0313 10:41:58.290073 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c65aadf_c6fc_4959_9366_3e9d378bb507.slice/crio-c8233ad861b72b9697a6e4a4a22f2aef41687bad319280d350c972b96db1ede9 WatchSource:0}: Error finding container c8233ad861b72b9697a6e4a4a22f2aef41687bad319280d350c972b96db1ede9: Status 404 returned error can't find the container with id c8233ad861b72b9697a6e4a4a22f2aef41687bad319280d350c972b96db1ede9
Mar 13 10:41:58.293166 master-0 kubenswrapper[17876]: I0313 10:41:58.293081 17876 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 10:41:58.295466 master-0 kubenswrapper[17876]: I0313 10:41:58.295365 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvmjs\" (UniqueName: \"kubernetes.io/projected/193b3b95-f9a3-4272-853b-86366ce348a2-kube-api-access-fvmjs\") pod \"ovnkube-control-plane-66b55d57d-ns7z7\" (UID: \"193b3b95-f9a3-4272-853b-86366ce348a2\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7"
Mar 13 10:41:58.320447 master-0 kubenswrapper[17876]: I0313 10:41:58.320386 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ch8qd\" (UniqueName: \"kubernetes.io/projected/17b956d3-c046-4f26-8be2-718c165a3acc-kube-api-access-ch8qd\") pod \"cluster-monitoring-operator-674cbfbd9d-7rcdn\" (UID: \"17b956d3-c046-4f26-8be2-718c165a3acc\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-7rcdn"
Mar 13 10:41:58.338245 master-0 kubenswrapper[17876]: I0313 10:41:58.338149 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chxxr\" (UniqueName: \"kubernetes.io/projected/e4b55ebf-cab8-4985-95cc-b28bc5ae0578-kube-api-access-chxxr\") pod \"cluster-autoscaler-operator-69576476f7-p7qlt\" (UID: \"e4b55ebf-cab8-4985-95cc-b28bc5ae0578\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt"
Mar 13 10:41:58.355338 master-0 kubenswrapper[17876]: I0313 10:41:58.355271 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gcf6\" (UniqueName: \"kubernetes.io/projected/3e15f776-d153-4289-91c7-893584104185-kube-api-access-2gcf6\") pod \"dns-default-qt95m\" (UID: \"3e15f776-d153-4289-91c7-893584104185\") " pod="openshift-dns/dns-default-qt95m"
Mar 13 10:41:58.358085 master-0 kubenswrapper[17876]: I0313 10:41:58.357874 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzvxz\" (UniqueName: \"kubernetes.io/projected/277614e8-838f-4773-bcfc-89f19c620dee-kube-api-access-jzvxz\") on node \"master-0\" DevicePath \"\""
Mar 13 10:41:58.374544 master-0 kubenswrapper[17876]: I0313 10:41:58.374484 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppjzw\" (UniqueName: \"kubernetes.io/projected/9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c-kube-api-access-ppjzw\") pod \"node-resolver-d542b\" (UID: \"9c5a0e51-9e3c-4b3f-b3bb-c76c0326833c\") " pod="openshift-dns/node-resolver-d542b"
Mar 13 10:41:58.399636 master-0 kubenswrapper[17876]: E0313 10:41:58.399574 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:41:58.399921 master-0 kubenswrapper[17876]: E0313 10:41:58.399896 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:41:58.400063 master-0 kubenswrapper[17876]: E0313 10:41:58.400050 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:58.900029232 +0000 UTC m=+26.735835708 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:41:58.418961 master-0 kubenswrapper[17876]: I0313 10:41:58.418879 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hfjxf\" (UniqueName: \"kubernetes.io/projected/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-kube-api-access-hfjxf\") pod \"oauth-openshift-5db65d9766-lg686\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:58.423745 master-0 kubenswrapper[17876]: I0313 10:41:58.423682 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686"
Mar 13 10:41:58.438534 master-0 kubenswrapper[17876]: I0313 10:41:58.438475 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6hq5\" (UniqueName: \"kubernetes.io/projected/1109b282-3ee4-4c4e-a64a-e6a22adeb6c9-kube-api-access-t6hq5\") pod \"console-operator-6c7fb6b958-rb7nv\" (UID: \"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9\") " pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv"
Mar 13 10:41:58.462759 master-0 kubenswrapper[17876]: I0313 10:41:58.462702 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtq5b\" (UniqueName: \"kubernetes.io/projected/c179be5b-2517-4ae5-9c30-2d4415899123-kube-api-access-jtq5b\") pod \"ingress-canary-p5ncj\" (UID: \"c179be5b-2517-4ae5-9c30-2d4415899123\") " pod="openshift-ingress-canary/ingress-canary-p5ncj"
Mar 13 10:41:58.478548 master-0 kubenswrapper[17876]: I0313 10:41:58.478420 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-txw7p\" (UniqueName: \"kubernetes.io/projected/7c5279e3-0165-4347-bfc7-87b80accaab3-kube-api-access-txw7p\") pod \"kube-state-metrics-68b88f8cb5-dw9w6\" (UID: \"7c5279e3-0165-4347-bfc7-87b80accaab3\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"
Mar 13 10:41:58.497853 master-0 kubenswrapper[17876]: I0313 10:41:58.497798 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfxbl\" (UniqueName: \"kubernetes.io/projected/d23e2957-3a22-44f6-937c-5ab6314681c0-kube-api-access-mfxbl\") pod \"node-ca-trztz\" (UID: \"d23e2957-3a22-44f6-937c-5ab6314681c0\") " pod="openshift-image-registry/node-ca-trztz"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: E0313 10:41:58.543691 17876 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="4.051s"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543743 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6558455fc8-8qww9"]
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543773 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543820 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543863 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-cfp26"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543878 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543891 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543907 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543918 17876 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="fc60183e-eaac-41ef-a9ef-6ba30d1fb673"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543936 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-lhqzl"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543951 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543962 17876 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="fc60183e-eaac-41ef-a9ef-6ba30d1fb673"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.543974 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:41:58.553997 master-0 kubenswrapper[17876]: I0313 10:41:58.544009 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"
Mar 13 10:41:58.633824 master-0 kubenswrapper[17876]: I0313 10:41:58.633539 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"
Mar 13 10:41:58.658970 master-0 kubenswrapper[17876]: I0313 10:41:58.658917 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:58.658970 master-0 kubenswrapper[17876]: I0313 10:41:58.658972 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"
Mar 13 10:41:58.659488 master-0 kubenswrapper[17876]: I0313 10:41:58.659029 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.659568 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-kwwkz"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.659709 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-dnhzw"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.659758 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.659839 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.659918 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.659954 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-tw9nm"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.660040 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.660058 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.660075 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch"
Mar 13 10:41:58.660180 master-0 kubenswrapper[17876]: I0313 10:41:58.660136 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-mbkch"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660258 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-576d4447f8-zqphk"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660281 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-lhqzl"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660321 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-lhqzl"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660388 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660469 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660591 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660694 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660772 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-jwfjl"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660825 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-h46sf"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660841 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660857 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"
Mar 13 10:41:58.660929 master-0 kubenswrapper[17876]: I0313 10:41:58.660937 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:58.666809 master-0 kubenswrapper[17876]: I0313 10:41:58.661020 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t"
Mar 13 10:41:58.666809 master-0 kubenswrapper[17876]: I0313 10:41:58.661035 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:58.666809 master-0 kubenswrapper[17876]: I0313 10:41:58.661049 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:58.666809 master-0 kubenswrapper[17876]: I0313 10:41:58.661084 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:41:58.666809 master-0 kubenswrapper[17876]: I0313 10:41:58.661148 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"
Mar 13 10:41:58.689677 master-0 kubenswrapper[17876]: I0313 10:41:58.668322 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-dnhzw"
Mar 13 10:41:58.689906 master-0 kubenswrapper[17876]: I0313 10:41:58.689803 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f"
Mar 13 10:41:58.689906 master-0 kubenswrapper[17876]: I0313 10:41:58.689864 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-qt95m"
Mar 13 10:41:58.689969 master-0 kubenswrapper[17876]: I0313 10:41:58.689921 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-kqrsd"
Mar 13 10:41:58.689969 master-0 kubenswrapper[17876]: I0313 10:41:58.689954 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:41:58.690024 master-0 kubenswrapper[17876]: I0313 10:41:58.689980 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52"
Mar 13 10:41:58.690024 master-0 kubenswrapper[17876]: I0313 10:41:58.689998 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f"
Mar 13 10:41:58.690024 master-0 kubenswrapper[17876]: I0313 10:41:58.690019 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"
Mar 13 10:41:58.690124 master-0 kubenswrapper[17876]: I0313 10:41:58.690039 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-qt95m"
Mar 13 10:41:58.690124 master-0 kubenswrapper[17876]: I0313 10:41:58.690063 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-85b658d7fb-45fq6"
Mar 13 10:41:58.690182 master-0 kubenswrapper[17876]: I0313 10:41:58.690143 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:41:58.715486 master-0 kubenswrapper[17876]: I0313 10:41:58.715371 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-dnhzw"
Mar 13 10:41:58.738627 master-0 kubenswrapper[17876]: I0313 10:41:58.738561 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv"
Mar 13 10:41:58.766008 master-0 kubenswrapper[17876]: I0313 10:41:58.765937 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-p5ncj"
Mar 13 10:41:58.791926 master-0 kubenswrapper[17876]: I0313 10:41:58.791192 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-trztz"
Mar 13 10:41:58.979598 master-0 kubenswrapper[17876]: I0313 10:41:58.979553 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-5db65d9766-lg686"]
Mar 13 10:41:58.980304 master-0 kubenswrapper[17876]: I0313 10:41:58.980248 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:41:58.980559 master-0 kubenswrapper[17876]: E0313 10:41:58.980523 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:41:58.980615 master-0 kubenswrapper[17876]: E0313 10:41:58.980560 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:41:58.980615 master-0 kubenswrapper[17876]: E0313 10:41:58.980611 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:41:59.980593916 +0000 UTC m=+27.816400392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:41:58.989035 master-0 kubenswrapper[17876]: W0313 10:41:58.988906 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf6490f9f_b03e_4b2f_a8e9_56a9cfdb18f9.slice/crio-8ac13dfa7991399d25aa89c8b83f457b47e56c78ad30fea41fae3c6f40d2a64e WatchSource:0}: Error finding container 8ac13dfa7991399d25aa89c8b83f457b47e56c78ad30fea41fae3c6f40d2a64e: Status 404 returned error can't find the container with id 8ac13dfa7991399d25aa89c8b83f457b47e56c78ad30fea41fae3c6f40d2a64e
Mar 13 10:41:59.120848 master-0 kubenswrapper[17876]: I0313 10:41:59.120778 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6"]
Mar 13 10:41:59.129849 master-0 kubenswrapper[17876]: W0313 10:41:59.129774 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7c5279e3_0165_4347_bfc7_87b80accaab3.slice/crio-910aad7c1e0a5f0158b5f66bdf8674f2d0713e0381256d26e807c09fe7cff4f3 WatchSource:0}: Error finding container 910aad7c1e0a5f0158b5f66bdf8674f2d0713e0381256d26e807c09fe7cff4f3: Status 404 returned error can't find the container with id 910aad7c1e0a5f0158b5f66bdf8674f2d0713e0381256d26e807c09fe7cff4f3
Mar 13 10:41:59.145726 master-0 kubenswrapper[17876]: I0313 10:41:59.145631 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:41:59.220410 master-0 kubenswrapper[17876]: I0313 10:41:59.220337 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-rb7nv"]
Mar 13 10:41:59.231116 master-0 kubenswrapper[17876]: W0313 10:41:59.231038 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1109b282_3ee4_4c4e_a64a_e6a22adeb6c9.slice/crio-af0e5096ec578c1b618a606f70219dd1c90375bc3157658061e9e5c0569ee689 WatchSource:0}: Error finding container af0e5096ec578c1b618a606f70219dd1c90375bc3157658061e9e5c0569ee689: Status 404 returned error can't find the container with id af0e5096ec578c1b618a606f70219dd1c90375bc3157658061e9e5c0569ee689
Mar 13 10:41:59.300249 master-0 kubenswrapper[17876]: I0313 10:41:59.300190 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" event={"ID":"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9","Type":"ContainerStarted","Data":"8ac13dfa7991399d25aa89c8b83f457b47e56c78ad30fea41fae3c6f40d2a64e"}
Mar 13 10:41:59.301522 master-0 kubenswrapper[17876]: I0313 10:41:59.301468 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" event={"ID":"5c65aadf-c6fc-4959-9366-3e9d378bb507","Type":"ContainerStarted","Data":"c8233ad861b72b9697a6e4a4a22f2aef41687bad319280d350c972b96db1ede9"}
Mar 13 10:41:59.303503 master-0 kubenswrapper[17876]: I0313 10:41:59.303473 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-trztz" event={"ID":"d23e2957-3a22-44f6-937c-5ab6314681c0","Type":"ContainerStarted","Data":"12345304f40af9dfd3068e173a807226c5da951c235dbd20eb4d685a83dab27b"}
Mar 13 10:41:59.304878 master-0 kubenswrapper[17876]: I0313 10:41:59.304808 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" event={"ID":"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9","Type":"ContainerStarted","Data":"af0e5096ec578c1b618a606f70219dd1c90375bc3157658061e9e5c0569ee689"}
Mar 13 10:41:59.306448 master-0 kubenswrapper[17876]: I0313 10:41:59.306407 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" event={"ID":"7c5279e3-0165-4347-bfc7-87b80accaab3","Type":"ContainerStarted","Data":"910aad7c1e0a5f0158b5f66bdf8674f2d0713e0381256d26e807c09fe7cff4f3"}
Mar 13 10:41:59.306868 master-0 kubenswrapper[17876]: I0313 10:41:59.306794 17876 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 13 10:41:59.311799 master-0 kubenswrapper[17876]: I0313 10:41:59.311703 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-p5ncj"]
Mar 13 10:41:59.323734 master-0 kubenswrapper[17876]: W0313 10:41:59.323695 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc179be5b_2517_4ae5_9c30_2d4415899123.slice/crio-6df90e25590ccd1a19699799deb5333dcec70d6e9f34db406753f93eb87eca11 WatchSource:0}: Error finding container 6df90e25590ccd1a19699799deb5333dcec70d6e9f34db406753f93eb87eca11: Status 404 returned error can't find the container with id 6df90e25590ccd1a19699799deb5333dcec70d6e9f34db406753f93eb87eca11
Mar 13 10:41:59.486490 master-0 kubenswrapper[17876]: I0313 10:41:59.486242 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=31.486166781 podStartE2EDuration="31.486166781s" podCreationTimestamp="2026-03-13 10:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:59.482763661 +0000 UTC m=+27.318570167" watchObservedRunningTime="2026-03-13 10:41:59.486166781 +0000 UTC m=+27.321973277"
Mar 13 10:41:59.816589 master-0 kubenswrapper[17876]: I0313 10:41:59.811982 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=31.811950565 podStartE2EDuration="31.811950565s" podCreationTimestamp="2026-03-13 10:41:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:41:59.811707927 +0000 UTC m=+27.647514403" watchObservedRunningTime="2026-03-13 10:41:59.811950565 +0000 UTC m=+27.647757041"
Mar 13 10:42:00.009065 master-0 kubenswrapper[17876]: I0313 10:42:00.009009 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:42:00.009596 master-0 kubenswrapper[17876]: E0313 10:42:00.009305 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:00.009596 master-0 kubenswrapper[17876]: E0313 10:42:00.009354 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:00.009596 master-0 kubenswrapper[17876]: E0313 10:42:00.009428 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:42:02.009406615 +0000 UTC m=+29.845213091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:00.315067 master-0 kubenswrapper[17876]: I0313 10:42:00.315003 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" event={"ID":"5c65aadf-c6fc-4959-9366-3e9d378bb507","Type":"ContainerStarted","Data":"82624d7673f1017278f6b0f43f163c4198a14501a5661663564a8a4eab82a773"}
Mar 13 10:42:00.315407 master-0 kubenswrapper[17876]: I0313 10:42:00.315376 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9"
Mar 13 10:42:00.316615 master-0 kubenswrapper[17876]: I0313 10:42:00.316569 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p5ncj" event={"ID":"c179be5b-2517-4ae5-9c30-2d4415899123","Type":"ContainerStarted","Data":"9de2f898205c8c62694ff948fd702768e26a9ce43c73c309c8507d29df3465e8"}
Mar 13 10:42:00.316615 master-0 kubenswrapper[17876]: I0313 10:42:00.316613 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-p5ncj" event={"ID":"c179be5b-2517-4ae5-9c30-2d4415899123","Type":"ContainerStarted","Data":"6df90e25590ccd1a19699799deb5333dcec70d6e9f34db406753f93eb87eca11"}
Mar 13 10:42:00.322956 master-0 kubenswrapper[17876]: I0313 10:42:00.322902 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9"
Mar 13 10:42:02.030771 master-0 kubenswrapper[17876]: I0313 10:42:02.030679 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName:
\"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:42:02.031449 master-0 kubenswrapper[17876]: E0313 10:42:02.030897 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 10:42:02.031449 master-0 kubenswrapper[17876]: E0313 10:42:02.030929 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 10:42:02.031449 master-0 kubenswrapper[17876]: E0313 10:42:02.030981 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:42:06.03096571 +0000 UTC m=+33.866772186 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 10:42:02.389581 master-0 kubenswrapper[17876]: I0313 10:42:02.385374 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5db65d9766-lg686"] Mar 13 10:42:02.421125 master-0 kubenswrapper[17876]: I0313 10:42:02.419672 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7z94w"] Mar 13 10:42:02.428941 master-0 kubenswrapper[17876]: I0313 10:42:02.428883 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-7z94w"] Mar 13 10:42:02.434371 master-0 kubenswrapper[17876]: I0313 10:42:02.434320 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-576d4447f8-zqphk" Mar 13 10:42:02.511119 master-0 kubenswrapper[17876]: I0313 10:42:02.509727 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="277614e8-838f-4773-bcfc-89f19c620dee" path="/var/lib/kubelet/pods/277614e8-838f-4773-bcfc-89f19c620dee/volumes" Mar 13 10:42:02.548235 master-0 kubenswrapper[17876]: I0313 10:42:02.544356 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6558455fc8-8qww9" podStartSLOduration=8.781398705 podStartE2EDuration="10.544330219s" podCreationTimestamp="2026-03-13 10:41:52 +0000 UTC" firstStartedPulling="2026-03-13 10:41:58.292297173 +0000 UTC m=+26.128103639" lastFinishedPulling="2026-03-13 10:42:00.055228667 +0000 UTC m=+27.891035153" observedRunningTime="2026-03-13 10:42:02.542742893 +0000 UTC m=+30.378549389" watchObservedRunningTime="2026-03-13 10:42:02.544330219 +0000 UTC m=+30.380136695" Mar 13 
10:42:02.751334 master-0 kubenswrapper[17876]: I0313 10:42:02.751210 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-6gzxr"] Mar 13 10:42:02.752044 master-0 kubenswrapper[17876]: I0313 10:42:02.752008 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-6gzxr"] Mar 13 10:42:02.916320 master-0 kubenswrapper[17876]: I0313 10:42:02.916205 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-p5ncj" podStartSLOduration=14.916175349 podStartE2EDuration="14.916175349s" podCreationTimestamp="2026-03-13 10:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:42:02.913672276 +0000 UTC m=+30.749478762" watchObservedRunningTime="2026-03-13 10:42:02.916175349 +0000 UTC m=+30.751981825" Mar 13 10:42:03.045221 master-0 kubenswrapper[17876]: I0313 10:42:03.036465 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-999d99f5f-hlk52" Mar 13 10:42:04.157999 master-0 kubenswrapper[17876]: I0313 10:42:04.157946 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 13 10:42:04.324842 master-0 kubenswrapper[17876]: I0313 10:42:04.324784 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:42:04.325138 master-0 kubenswrapper[17876]: I0313 10:42:04.324991 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:42:04.332524 master-0 kubenswrapper[17876]: I0313 10:42:04.332310 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:42:04.524830 master-0 kubenswrapper[17876]: I0313 10:42:04.524690 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5da919b6-8545-4001-89f3-74cb289327f0" path="/var/lib/kubelet/pods/5da919b6-8545-4001-89f3-74cb289327f0/volumes" Mar 13 10:42:05.539920 master-0 kubenswrapper[17876]: I0313 10:42:05.539772 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:42:05.540421 master-0 kubenswrapper[17876]: I0313 10:42:05.539973 17876 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 13 10:42:05.558952 master-0 kubenswrapper[17876]: I0313 10:42:05.558900 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-vww4t" Mar 13 10:42:06.031938 master-0 kubenswrapper[17876]: I0313 10:42:06.031878 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0" Mar 13 10:42:06.032191 master-0 kubenswrapper[17876]: E0313 10:42:06.032044 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 10:42:06.032191 master-0 kubenswrapper[17876]: E0313 10:42:06.032065 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 10:42:06.032191 master-0 kubenswrapper[17876]: E0313 10:42:06.032128 17876 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:42:14.032112095 +0000 UTC m=+41.867918571 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 13 10:42:07.161717 master-0 kubenswrapper[17876]: I0313 10:42:07.161669 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-kwwkz" Mar 13 10:42:07.389357 master-0 kubenswrapper[17876]: I0313 10:42:07.389313 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-trztz" event={"ID":"d23e2957-3a22-44f6-937c-5ab6314681c0","Type":"ContainerStarted","Data":"111416b02e12a1b998f81002727063da8fee579ab5d0d10fbb48f6caa708d2f9"} Mar 13 10:42:07.391890 master-0 kubenswrapper[17876]: I0313 10:42:07.391866 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" event={"ID":"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9","Type":"ContainerStarted","Data":"789afc8d8a6e306039624650966ee23018a7ba7ea1dbcde6122e9d4057d3711b"} Mar 13 10:42:07.392224 master-0 kubenswrapper[17876]: I0313 10:42:07.392141 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:42:07.394521 master-0 kubenswrapper[17876]: I0313 10:42:07.394500 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" 
event={"ID":"7c5279e3-0165-4347-bfc7-87b80accaab3","Type":"ContainerStarted","Data":"c0ff75c7178099f92f172beaf8eb70f61584db851a66480249bca00a5b1f2786"} Mar 13 10:42:07.394709 master-0 kubenswrapper[17876]: I0313 10:42:07.394675 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" event={"ID":"7c5279e3-0165-4347-bfc7-87b80accaab3","Type":"ContainerStarted","Data":"4b1b46102a483a6c0cbb8eda3fc9a8e090da503ff824a72fe7e60c5c99102745"} Mar 13 10:42:07.394805 master-0 kubenswrapper[17876]: I0313 10:42:07.394792 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-dw9w6" event={"ID":"7c5279e3-0165-4347-bfc7-87b80accaab3","Type":"ContainerStarted","Data":"23bd645b420f3259f2284a02e55437455cf2312e2ea6079c49a90e903c6f8145"} Mar 13 10:42:07.397234 master-0 kubenswrapper[17876]: I0313 10:42:07.397189 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" event={"ID":"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9","Type":"ContainerStarted","Data":"a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771"} Mar 13 10:42:07.397397 master-0 kubenswrapper[17876]: I0313 10:42:07.397361 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:42:07.400186 master-0 kubenswrapper[17876]: I0313 10:42:07.399910 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:42:07.412016 master-0 kubenswrapper[17876]: I0313 10:42:07.411852 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-trztz" podStartSLOduration=8.935475409 podStartE2EDuration="16.411830307s" podCreationTimestamp="2026-03-13 10:41:51 +0000 UTC" firstStartedPulling="2026-03-13 10:41:58.837677195 +0000 UTC 
m=+26.673483671" lastFinishedPulling="2026-03-13 10:42:06.314032103 +0000 UTC m=+34.149838569" observedRunningTime="2026-03-13 10:42:07.409443108 +0000 UTC m=+35.245249604" watchObservedRunningTime="2026-03-13 10:42:07.411830307 +0000 UTC m=+35.247636783" Mar 13 10:42:07.478686 master-0 kubenswrapper[17876]: I0313 10:42:07.478589 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" podStartSLOduration=19.123691968 podStartE2EDuration="26.478566825s" podCreationTimestamp="2026-03-13 10:41:41 +0000 UTC" firstStartedPulling="2026-03-13 10:41:58.992019051 +0000 UTC m=+26.827825527" lastFinishedPulling="2026-03-13 10:42:06.346893907 +0000 UTC m=+34.182700384" observedRunningTime="2026-03-13 10:42:07.478529484 +0000 UTC m=+35.314335960" watchObservedRunningTime="2026-03-13 10:42:07.478566825 +0000 UTC m=+35.314373301" Mar 13 10:42:07.490604 master-0 kubenswrapper[17876]: I0313 10:42:07.490541 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-pgdzh"] Mar 13 10:42:07.491179 master-0 kubenswrapper[17876]: E0313 10:42:07.491155 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="kube-rbac-proxy" Mar 13 10:42:07.491274 master-0 kubenswrapper[17876]: I0313 10:42:07.491263 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="kube-rbac-proxy" Mar 13 10:42:07.491374 master-0 kubenswrapper[17876]: E0313 10:42:07.491363 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b44838d-cfe0-42fe-9927-d0b5391eee81" containerName="installer" Mar 13 10:42:07.491436 master-0 kubenswrapper[17876]: I0313 10:42:07.491426 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b44838d-cfe0-42fe-9927-d0b5391eee81" containerName="installer" Mar 13 10:42:07.491509 master-0 kubenswrapper[17876]: E0313 
10:42:07.491499 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="multus-admission-controller" Mar 13 10:42:07.491586 master-0 kubenswrapper[17876]: I0313 10:42:07.491575 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="multus-admission-controller" Mar 13 10:42:07.491664 master-0 kubenswrapper[17876]: E0313 10:42:07.491651 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="277614e8-838f-4773-bcfc-89f19c620dee" containerName="kube-multus-additional-cni-plugins" Mar 13 10:42:07.491755 master-0 kubenswrapper[17876]: I0313 10:42:07.491736 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="277614e8-838f-4773-bcfc-89f19c620dee" containerName="kube-multus-additional-cni-plugins" Mar 13 10:42:07.491941 master-0 kubenswrapper[17876]: I0313 10:42:07.491928 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b44838d-cfe0-42fe-9927-d0b5391eee81" containerName="installer" Mar 13 10:42:07.492029 master-0 kubenswrapper[17876]: I0313 10:42:07.492018 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="277614e8-838f-4773-bcfc-89f19c620dee" containerName="kube-multus-additional-cni-plugins" Mar 13 10:42:07.492135 master-0 kubenswrapper[17876]: I0313 10:42:07.492124 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="multus-admission-controller" Mar 13 10:42:07.492207 master-0 kubenswrapper[17876]: I0313 10:42:07.492197 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="5da919b6-8545-4001-89f3-74cb289327f0" containerName="kube-rbac-proxy" Mar 13 10:42:07.492654 master-0 kubenswrapper[17876]: I0313 10:42:07.492639 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-84f57b9877-pgdzh" Mar 13 10:42:07.496851 master-0 kubenswrapper[17876]: I0313 10:42:07.494367 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-5lpgr" Mar 13 10:42:07.496851 master-0 kubenswrapper[17876]: I0313 10:42:07.495391 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 10:42:07.496851 master-0 kubenswrapper[17876]: I0313 10:42:07.495585 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 10:42:07.504675 master-0 kubenswrapper[17876]: I0313 10:42:07.504593 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" podStartSLOduration=14.376337274 podStartE2EDuration="21.50456676s" podCreationTimestamp="2026-03-13 10:41:46 +0000 UTC" firstStartedPulling="2026-03-13 10:41:59.233536343 +0000 UTC m=+27.069342819" lastFinishedPulling="2026-03-13 10:42:06.361765829 +0000 UTC m=+34.197572305" observedRunningTime="2026-03-13 10:42:07.501874372 +0000 UTC m=+35.337680848" watchObservedRunningTime="2026-03-13 10:42:07.50456676 +0000 UTC m=+35.340373236" Mar 13 10:42:07.517969 master-0 kubenswrapper[17876]: I0313 10:42:07.517913 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-lhqzl" Mar 13 10:42:07.532384 master-0 kubenswrapper[17876]: I0313 10:42:07.532305 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-pgdzh"] Mar 13 10:42:07.551930 master-0 kubenswrapper[17876]: I0313 10:42:07.549318 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsb8t\" (UniqueName: \"kubernetes.io/projected/3db4d656-fd93-41a2-8e49-0d1685c4e3d3-kube-api-access-bsb8t\") pod 
\"downloads-84f57b9877-pgdzh\" (UID: \"3db4d656-fd93-41a2-8e49-0d1685c4e3d3\") " pod="openshift-console/downloads-84f57b9877-pgdzh" Mar 13 10:42:07.626874 master-0 kubenswrapper[17876]: I0313 10:42:07.626818 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:42:07.653124 master-0 kubenswrapper[17876]: I0313 10:42:07.651787 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsb8t\" (UniqueName: \"kubernetes.io/projected/3db4d656-fd93-41a2-8e49-0d1685c4e3d3-kube-api-access-bsb8t\") pod \"downloads-84f57b9877-pgdzh\" (UID: \"3db4d656-fd93-41a2-8e49-0d1685c4e3d3\") " pod="openshift-console/downloads-84f57b9877-pgdzh" Mar 13 10:42:07.669924 master-0 kubenswrapper[17876]: I0313 10:42:07.669841 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsb8t\" (UniqueName: \"kubernetes.io/projected/3db4d656-fd93-41a2-8e49-0d1685c4e3d3-kube-api-access-bsb8t\") pod \"downloads-84f57b9877-pgdzh\" (UID: \"3db4d656-fd93-41a2-8e49-0d1685c4e3d3\") " pod="openshift-console/downloads-84f57b9877-pgdzh" Mar 13 10:42:07.765597 master-0 kubenswrapper[17876]: I0313 10:42:07.765504 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-kqrsd" Mar 13 10:42:07.823468 master-0 kubenswrapper[17876]: I0313 10:42:07.820420 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-84f57b9877-pgdzh" Mar 13 10:42:08.345500 master-0 kubenswrapper[17876]: I0313 10:42:08.345348 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-pgdzh"] Mar 13 10:42:08.350660 master-0 kubenswrapper[17876]: W0313 10:42:08.350618 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3db4d656_fd93_41a2_8e49_0d1685c4e3d3.slice/crio-5807ef9ef3dbf6a2b09b91edb021b75d5ffa348c643975f84fdee853d3879e59 WatchSource:0}: Error finding container 5807ef9ef3dbf6a2b09b91edb021b75d5ffa348c643975f84fdee853d3879e59: Status 404 returned error can't find the container with id 5807ef9ef3dbf6a2b09b91edb021b75d5ffa348c643975f84fdee853d3879e59 Mar 13 10:42:08.372570 master-0 kubenswrapper[17876]: I0313 10:42:08.372519 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-dnhzw" Mar 13 10:42:08.404327 master-0 kubenswrapper[17876]: I0313 10:42:08.404235 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-pgdzh" event={"ID":"3db4d656-fd93-41a2-8e49-0d1685c4e3d3","Type":"ContainerStarted","Data":"5807ef9ef3dbf6a2b09b91edb021b75d5ffa348c643975f84fdee853d3879e59"} Mar 13 10:42:09.580186 master-0 kubenswrapper[17876]: I0313 10:42:09.580071 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:42:09.581551 master-0 kubenswrapper[17876]: I0313 10:42:09.581266 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" containerName="startup-monitor" containerID="cri-o://d472edafe5d759160888ed04e3afd874976a6f531f1a77a132237f479f6f2ec3" gracePeriod=5 Mar 13 10:42:12.331366 master-0 
kubenswrapper[17876]: I0313 10:42:12.331252 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-7776f76bf7-f4jhw"] Mar 13 10:42:12.332583 master-0 kubenswrapper[17876]: E0313 10:42:12.331652 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" containerName="startup-monitor" Mar 13 10:42:12.332583 master-0 kubenswrapper[17876]: I0313 10:42:12.331704 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" containerName="startup-monitor" Mar 13 10:42:12.332583 master-0 kubenswrapper[17876]: I0313 10:42:12.331945 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" containerName="startup-monitor" Mar 13 10:42:12.334415 master-0 kubenswrapper[17876]: I0313 10:42:12.334376 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7776f76bf7-f4jhw" Mar 13 10:42:12.336927 master-0 kubenswrapper[17876]: I0313 10:42:12.336884 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 10:42:12.339140 master-0 kubenswrapper[17876]: I0313 10:42:12.337103 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-qddwq" Mar 13 10:42:12.339140 master-0 kubenswrapper[17876]: I0313 10:42:12.337766 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 10:42:12.339140 master-0 kubenswrapper[17876]: I0313 10:42:12.337880 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 13 10:42:12.339140 master-0 kubenswrapper[17876]: I0313 10:42:12.338037 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 10:42:12.339140 master-0 kubenswrapper[17876]: I0313 10:42:12.338167 17876 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 10:42:12.349058 master-0 kubenswrapper[17876]: I0313 10:42:12.348959 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7776f76bf7-f4jhw"] Mar 13 10:42:12.436937 master-0 kubenswrapper[17876]: I0313 10:42:12.436877 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-config\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw" Mar 13 10:42:12.437138 master-0 kubenswrapper[17876]: I0313 10:42:12.436961 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-serving-cert\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw" Mar 13 10:42:12.437138 master-0 kubenswrapper[17876]: I0313 10:42:12.436998 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-oauth-config\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw" Mar 13 10:42:12.437138 master-0 kubenswrapper[17876]: I0313 10:42:12.437020 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-oauth-serving-cert\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw" Mar 
13 10:42:12.437138 master-0 kubenswrapper[17876]: I0313 10:42:12.437050 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-service-ca\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.437138 master-0 kubenswrapper[17876]: I0313 10:42:12.437068 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb59r\" (UniqueName: \"kubernetes.io/projected/9413fefe-20d4-4f4c-939a-c9d45eda6032-kube-api-access-tb59r\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.653593 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb59r\" (UniqueName: \"kubernetes.io/projected/9413fefe-20d4-4f4c-939a-c9d45eda6032-kube-api-access-tb59r\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.653691 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-service-ca\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.653894 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-config\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.655501 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-serving-cert\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.655562 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-oauth-config\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.655594 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-oauth-serving-cert\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.656619 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-config\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.657477 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-service-ca\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.659130 master-0 kubenswrapper[17876]: I0313 10:42:12.657675 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-oauth-serving-cert\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.677609 master-0 kubenswrapper[17876]: I0313 10:42:12.675422 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-serving-cert\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.690118 master-0 kubenswrapper[17876]: I0313 10:42:12.684759 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-oauth-config\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.695281 master-0 kubenswrapper[17876]: I0313 10:42:12.695228 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb59r\" (UniqueName: \"kubernetes.io/projected/9413fefe-20d4-4f4c-939a-c9d45eda6032-kube-api-access-tb59r\") pod \"console-7776f76bf7-f4jhw\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.740120 master-0 kubenswrapper[17876]: I0313 10:42:12.736545 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:12.964432 master-0 kubenswrapper[17876]: I0313 10:42:12.963289 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"]
Mar 13 10:42:12.964432 master-0 kubenswrapper[17876]: I0313 10:42:12.964180 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:12.970525 master-0 kubenswrapper[17876]: I0313 10:42:12.969494 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-c9gmn"
Mar 13 10:42:12.971297 master-0 kubenswrapper[17876]: I0313 10:42:12.971270 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 10:42:12.971564 master-0 kubenswrapper[17876]: I0313 10:42:12.971523 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 13 10:42:12.986887 master-0 kubenswrapper[17876]: I0313 10:42:12.985297 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"]
Mar 13 10:42:13.315401 master-0 kubenswrapper[17876]: I0313 10:42:13.314808 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/831aa5fe-9170-4e69-9796-b423c03b5060-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:13.315401 master-0 kubenswrapper[17876]: I0313 10:42:13.314934 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/831aa5fe-9170-4e69-9796-b423c03b5060-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:13.371526 master-0 kubenswrapper[17876]: I0313 10:42:13.371348 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7776f76bf7-f4jhw"]
Mar 13 10:42:13.417695 master-0 kubenswrapper[17876]: I0313 10:42:13.417327 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/831aa5fe-9170-4e69-9796-b423c03b5060-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:13.417695 master-0 kubenswrapper[17876]: E0313 10:42:13.417540 17876 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 13 10:42:13.417695 master-0 kubenswrapper[17876]: E0313 10:42:13.417624 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/831aa5fe-9170-4e69-9796-b423c03b5060-networking-console-plugin-cert podName:831aa5fe-9170-4e69-9796-b423c03b5060 nodeName:}" failed. No retries permitted until 2026-03-13 10:42:13.917600554 +0000 UTC m=+41.753407030 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/831aa5fe-9170-4e69-9796-b423c03b5060-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-gkb5q" (UID: "831aa5fe-9170-4e69-9796-b423c03b5060") : secret "networking-console-plugin-cert" not found
Mar 13 10:42:13.418196 master-0 kubenswrapper[17876]: I0313 10:42:13.417843 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/831aa5fe-9170-4e69-9796-b423c03b5060-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:13.418836 master-0 kubenswrapper[17876]: I0313 10:42:13.418795 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/831aa5fe-9170-4e69-9796-b423c03b5060-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:13.759756 master-0 kubenswrapper[17876]: I0313 10:42:13.759628 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7776f76bf7-f4jhw" event={"ID":"9413fefe-20d4-4f4c-939a-c9d45eda6032","Type":"ContainerStarted","Data":"f42a30a155db9e007eb11ac80aa29a3ec9d8d56dd287aa15dcc220c47296ddfa"}
Mar 13 10:42:13.950067 master-0 kubenswrapper[17876]: I0313 10:42:13.949122 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/831aa5fe-9170-4e69-9796-b423c03b5060-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:13.955249 master-0 kubenswrapper[17876]: I0313 10:42:13.955180 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/831aa5fe-9170-4e69-9796-b423c03b5060-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-gkb5q\" (UID: \"831aa5fe-9170-4e69-9796-b423c03b5060\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:14.053186 master-0 kubenswrapper[17876]: I0313 10:42:14.050281 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:42:14.053186 master-0 kubenswrapper[17876]: E0313 10:42:14.050557 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:14.053186 master-0 kubenswrapper[17876]: E0313 10:42:14.050615 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:14.053186 master-0 kubenswrapper[17876]: E0313 10:42:14.050710 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:42:30.050679491 +0000 UTC m=+57.886485967 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:14.163866 master-0 kubenswrapper[17876]: I0313 10:42:14.161136 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 13 10:42:14.211698 master-0 kubenswrapper[17876]: I0313 10:42:14.207386 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"
Mar 13 10:42:14.546589 master-0 kubenswrapper[17876]: I0313 10:42:14.546524 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q"]
Mar 13 10:42:14.580119 master-0 kubenswrapper[17876]: W0313 10:42:14.579511 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod831aa5fe_9170_4e69_9796_b423c03b5060.slice/crio-7ee2830b66c0eeef81b8a1218670e022fcd66e4f7449c8542c18dcdacafab096 WatchSource:0}: Error finding container 7ee2830b66c0eeef81b8a1218670e022fcd66e4f7449c8542c18dcdacafab096: Status 404 returned error can't find the container with id 7ee2830b66c0eeef81b8a1218670e022fcd66e4f7449c8542c18dcdacafab096
Mar 13 10:42:14.770255 master-0 kubenswrapper[17876]: I0313 10:42:14.767976 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_acbb43bf2cf27ed60d1f635fd6638ac7/startup-monitor/0.log"
Mar 13 10:42:14.770255 master-0 kubenswrapper[17876]: I0313 10:42:14.768041 17876 generic.go:334] "Generic (PLEG): container finished" podID="acbb43bf2cf27ed60d1f635fd6638ac7" containerID="d472edafe5d759160888ed04e3afd874976a6f531f1a77a132237f479f6f2ec3" exitCode=137
Mar 13 10:42:14.770255 master-0 kubenswrapper[17876]: I0313 10:42:14.769293 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q" event={"ID":"831aa5fe-9170-4e69-9796-b423c03b5060","Type":"ContainerStarted","Data":"7ee2830b66c0eeef81b8a1218670e022fcd66e4f7449c8542c18dcdacafab096"}
Mar 13 10:42:15.173333 master-0 kubenswrapper[17876]: I0313 10:42:15.173220 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_acbb43bf2cf27ed60d1f635fd6638ac7/startup-monitor/0.log"
Mar 13 10:42:15.173333 master-0 kubenswrapper[17876]: I0313 10:42:15.173299 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:42:15.275594 master-0 kubenswrapper[17876]: I0313 10:42:15.275532 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") pod \"acbb43bf2cf27ed60d1f635fd6638ac7\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") "
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.275617 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") pod \"acbb43bf2cf27ed60d1f635fd6638ac7\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") "
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.275842 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") pod \"acbb43bf2cf27ed60d1f635fd6638ac7\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") "
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.275866 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") pod \"acbb43bf2cf27ed60d1f635fd6638ac7\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") "
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.275887 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") pod \"acbb43bf2cf27ed60d1f635fd6638ac7\" (UID: \"acbb43bf2cf27ed60d1f635fd6638ac7\") "
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.275956 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests" (OuterVolumeSpecName: "manifests") pod "acbb43bf2cf27ed60d1f635fd6638ac7" (UID: "acbb43bf2cf27ed60d1f635fd6638ac7"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.275988 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "acbb43bf2cf27ed60d1f635fd6638ac7" (UID: "acbb43bf2cf27ed60d1f635fd6638ac7"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:42:15.276070 master-0 kubenswrapper[17876]: I0313 10:42:15.276043 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log" (OuterVolumeSpecName: "var-log") pod "acbb43bf2cf27ed60d1f635fd6638ac7" (UID: "acbb43bf2cf27ed60d1f635fd6638ac7"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:42:15.276347 master-0 kubenswrapper[17876]: I0313 10:42:15.276072 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock" (OuterVolumeSpecName: "var-lock") pod "acbb43bf2cf27ed60d1f635fd6638ac7" (UID: "acbb43bf2cf27ed60d1f635fd6638ac7"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:42:15.276347 master-0 kubenswrapper[17876]: I0313 10:42:15.276330 17876 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-log\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:15.276413 master-0 kubenswrapper[17876]: I0313 10:42:15.276353 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:15.276413 master-0 kubenswrapper[17876]: I0313 10:42:15.276365 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:15.276413 master-0 kubenswrapper[17876]: I0313 10:42:15.276374 17876 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-manifests\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:15.288455 master-0 kubenswrapper[17876]: I0313 10:42:15.288388 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "acbb43bf2cf27ed60d1f635fd6638ac7" (UID: "acbb43bf2cf27ed60d1f635fd6638ac7"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:42:15.377896 master-0 kubenswrapper[17876]: I0313 10:42:15.377765 17876 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/acbb43bf2cf27ed60d1f635fd6638ac7-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:15.782844 master-0 kubenswrapper[17876]: I0313 10:42:15.782584 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_acbb43bf2cf27ed60d1f635fd6638ac7/startup-monitor/0.log"
Mar 13 10:42:15.782844 master-0 kubenswrapper[17876]: I0313 10:42:15.782690 17876 scope.go:117] "RemoveContainer" containerID="d472edafe5d759160888ed04e3afd874976a6f531f1a77a132237f479f6f2ec3"
Mar 13 10:42:15.782844 master-0 kubenswrapper[17876]: I0313 10:42:15.782731 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:42:15.958002 master-0 kubenswrapper[17876]: I0313 10:42:15.957686 17876 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="423737f2-6a36-49de-a0d5-13d5b9ab9338"
Mar 13 10:42:16.507910 master-0 kubenswrapper[17876]: I0313 10:42:16.507850 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acbb43bf2cf27ed60d1f635fd6638ac7" path="/var/lib/kubelet/pods/acbb43bf2cf27ed60d1f635fd6638ac7/volumes"
Mar 13 10:42:16.508296 master-0 kubenswrapper[17876]: I0313 10:42:16.508270 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 13 10:42:16.632972 master-0 kubenswrapper[17876]: I0313 10:42:16.632852 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 10:42:16.632972 master-0 kubenswrapper[17876]: I0313 10:42:16.632886 17876 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="423737f2-6a36-49de-a0d5-13d5b9ab9338"
Mar 13 10:42:16.640210 master-0 kubenswrapper[17876]: I0313 10:42:16.638739 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 10:42:16.640210 master-0 kubenswrapper[17876]: I0313 10:42:16.638793 17876 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="423737f2-6a36-49de-a0d5-13d5b9ab9338"
Mar 13 10:42:19.821084 master-0 kubenswrapper[17876]: I0313 10:42:19.821005 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q" event={"ID":"831aa5fe-9170-4e69-9796-b423c03b5060","Type":"ContainerStarted","Data":"359176924b8b952f87cdb666e42ab40e9297d64ecdd58432a69424d7b93b8544"}
Mar 13 10:42:19.824112 master-0 kubenswrapper[17876]: I0313 10:42:19.823954 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7776f76bf7-f4jhw" event={"ID":"9413fefe-20d4-4f4c-939a-c9d45eda6032","Type":"ContainerStarted","Data":"5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426"}
Mar 13 10:42:19.842508 master-0 kubenswrapper[17876]: I0313 10:42:19.842399 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-gkb5q" podStartSLOduration=3.376774836 podStartE2EDuration="7.842359991s" podCreationTimestamp="2026-03-13 10:42:12 +0000 UTC" firstStartedPulling="2026-03-13 10:42:14.582778655 +0000 UTC m=+42.418585131" lastFinishedPulling="2026-03-13 10:42:19.04836379 +0000 UTC m=+46.884170286" observedRunningTime="2026-03-13 10:42:19.841073873 +0000 UTC m=+47.676880369" watchObservedRunningTime="2026-03-13 10:42:19.842359991 +0000 UTC m=+47.678166467"
Mar 13 10:42:19.899249 master-0 kubenswrapper[17876]: I0313 10:42:19.895249 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7776f76bf7-f4jhw" podStartSLOduration=2.2133883069999998 podStartE2EDuration="7.895216856s" podCreationTimestamp="2026-03-13 10:42:12 +0000 UTC" firstStartedPulling="2026-03-13 10:42:13.391452075 +0000 UTC m=+41.227258551" lastFinishedPulling="2026-03-13 10:42:19.073280624 +0000 UTC m=+46.909087100" observedRunningTime="2026-03-13 10:42:19.894826225 +0000 UTC m=+47.730632711" watchObservedRunningTime="2026-03-13 10:42:19.895216856 +0000 UTC m=+47.731023322"
Mar 13 10:42:20.007706 master-0 kubenswrapper[17876]: I0313 10:42:20.007641 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-bfb55f4b6-qf9q7"]
Mar 13 10:42:20.008686 master-0 kubenswrapper[17876]: I0313 10:42:20.008661 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.022922 master-0 kubenswrapper[17876]: I0313 10:42:20.021885 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 10:42:20.034899 master-0 kubenswrapper[17876]: I0313 10:42:20.034852 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bfb55f4b6-qf9q7"]
Mar 13 10:42:20.184913 master-0 kubenswrapper[17876]: I0313 10:42:20.184855 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-serving-cert\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.184913 master-0 kubenswrapper[17876]: I0313 10:42:20.184915 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c56zq\" (UniqueName: \"kubernetes.io/projected/9bf54984-47df-48ea-861b-9d6546c0f82b-kube-api-access-c56zq\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.185210 master-0 kubenswrapper[17876]: I0313 10:42:20.184937 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-oauth-config\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.185210 master-0 kubenswrapper[17876]: I0313 10:42:20.184958 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-service-ca\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.185210 master-0 kubenswrapper[17876]: I0313 10:42:20.184977 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-oauth-serving-cert\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.185210 master-0 kubenswrapper[17876]: I0313 10:42:20.185011 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-console-config\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.185210 master-0 kubenswrapper[17876]: I0313 10:42:20.185035 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-trusted-ca-bundle\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.285950 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-serving-cert\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.286083 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c56zq\" (UniqueName: \"kubernetes.io/projected/9bf54984-47df-48ea-861b-9d6546c0f82b-kube-api-access-c56zq\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.286176 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-oauth-config\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.286254 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-service-ca\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.286336 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-oauth-serving-cert\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.286543 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-console-config\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.287234 master-0 kubenswrapper[17876]: I0313 10:42:20.286595 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-trusted-ca-bundle\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.289785 master-0 kubenswrapper[17876]: I0313 10:42:20.288716 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-service-ca\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.289785 master-0 kubenswrapper[17876]: I0313 10:42:20.288865 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-trusted-ca-bundle\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.289785 master-0 kubenswrapper[17876]: I0313 10:42:20.289244 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-console-config\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.290855 master-0 kubenswrapper[17876]: I0313 10:42:20.290767 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-oauth-serving-cert\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.303174 master-0 kubenswrapper[17876]: I0313 10:42:20.303006 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-oauth-config\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.312423 master-0 kubenswrapper[17876]: I0313 10:42:20.312371 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c56zq\" (UniqueName: \"kubernetes.io/projected/9bf54984-47df-48ea-861b-9d6546c0f82b-kube-api-access-c56zq\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.314416 master-0 kubenswrapper[17876]: I0313 10:42:20.314382 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-serving-cert\") pod \"console-bfb55f4b6-qf9q7\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.329903 master-0 kubenswrapper[17876]: I0313 10:42:20.329860 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:20.752571 master-0 kubenswrapper[17876]: I0313 10:42:20.752431 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-bfb55f4b6-qf9q7"]
Mar 13 10:42:20.761236 master-0 kubenswrapper[17876]: W0313 10:42:20.761195 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9bf54984_47df_48ea_861b_9d6546c0f82b.slice/crio-2817077f8bd8850602e04e02af065be1388aff47b0e480ea6a5c96c2be065880 WatchSource:0}: Error finding container 2817077f8bd8850602e04e02af065be1388aff47b0e480ea6a5c96c2be065880: Status 404 returned error can't find the container with id 2817077f8bd8850602e04e02af065be1388aff47b0e480ea6a5c96c2be065880
Mar 13 10:42:20.832383 master-0 kubenswrapper[17876]: I0313 10:42:20.832233 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfb55f4b6-qf9q7" event={"ID":"9bf54984-47df-48ea-861b-9d6546c0f82b","Type":"ContainerStarted","Data":"2817077f8bd8850602e04e02af065be1388aff47b0e480ea6a5c96c2be065880"}
Mar 13 10:42:21.848012 master-0 kubenswrapper[17876]: I0313 10:42:21.843792 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfb55f4b6-qf9q7" event={"ID":"9bf54984-47df-48ea-861b-9d6546c0f82b","Type":"ContainerStarted","Data":"9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af"}
Mar 13 10:42:22.162730 master-0 kubenswrapper[17876]: I0313 10:42:22.162603 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-bfb55f4b6-qf9q7" podStartSLOduration=3.162574837 podStartE2EDuration="3.162574837s" podCreationTimestamp="2026-03-13 10:42:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:42:22.160632971 +0000 UTC m=+49.996439447" watchObservedRunningTime="2026-03-13 10:42:22.162574837 +0000 UTC m=+49.998381323"
Mar 13 10:42:22.737022 master-0 kubenswrapper[17876]: I0313 10:42:22.736793 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:22.737022 master-0 kubenswrapper[17876]: I0313 10:42:22.736857 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-7776f76bf7-f4jhw"
Mar 13 10:42:22.740908 master-0 kubenswrapper[17876]: I0313 10:42:22.740824 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body=
Mar 13 10:42:22.741040 master-0 kubenswrapper[17876]: I0313 10:42:22.740901 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused"
Mar 13 10:42:30.108574 master-0 kubenswrapper[17876]: I0313 10:42:30.108448 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:42:30.109993 master-0 kubenswrapper[17876]: E0313 10:42:30.109506 17876 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:30.109993 master-0 kubenswrapper[17876]: E0313 10:42:30.109537 17876 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-2-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:30.109993 master-0 kubenswrapper[17876]: E0313 10:42:30.109619 17876 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access podName:3b44838d-cfe0-42fe-9927-d0b5391eee81 nodeName:}" failed. No retries permitted until 2026-03-13 10:43:02.109593235 +0000 UTC m=+89.945399711 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access") pod "installer-2-retry-1-master-0" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 13 10:42:30.330315 master-0 kubenswrapper[17876]: I0313 10:42:30.330278 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:30.330611 master-0 kubenswrapper[17876]: I0313 10:42:30.330596 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-bfb55f4b6-qf9q7"
Mar 13 10:42:30.331406 master-0 kubenswrapper[17876]: I0313 10:42:30.331273 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:42:30.331577 master-0 kubenswrapper[17876]: I0313 10:42:30.331527 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:42:32.281668 master-0
kubenswrapper[17876]: I0313 10:42:32.281541 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 13 10:42:32.282875 master-0 kubenswrapper[17876]: I0313 10:42:32.282842 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.294231 master-0 kubenswrapper[17876]: I0313 10:42:32.293339 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 10:42:32.294617 master-0 kubenswrapper[17876]: I0313 10:42:32.294583 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hnsk9" Mar 13 10:42:32.297003 master-0 kubenswrapper[17876]: I0313 10:42:32.296929 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 13 10:42:32.429383 master-0 kubenswrapper[17876]: I0313 10:42:32.429153 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" podUID="f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" containerName="oauth-openshift" containerID="cri-o://a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771" gracePeriod=15 Mar 13 10:42:32.455953 master-0 kubenswrapper[17876]: I0313 10:42:32.455869 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kube-api-access\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.456225 master-0 kubenswrapper[17876]: I0313 10:42:32.455961 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-var-lock\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.456225 master-0 kubenswrapper[17876]: I0313 10:42:32.456030 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.499660 master-0 kubenswrapper[17876]: I0313 10:42:32.498865 17876 scope.go:117] "RemoveContainer" containerID="ee4e3ab4663e1587ce994fc6b4abf7c85bf2b949922e7c558f6898fa4c2d1ce1" Mar 13 10:42:32.557206 master-0 kubenswrapper[17876]: I0313 10:42:32.557089 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kube-api-access\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.557388 master-0 kubenswrapper[17876]: I0313 10:42:32.557293 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-var-lock\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.557486 master-0 kubenswrapper[17876]: I0313 10:42:32.557436 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-var-lock\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 
10:42:32.557538 master-0 kubenswrapper[17876]: I0313 10:42:32.557491 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.557538 master-0 kubenswrapper[17876]: I0313 10:42:32.557518 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.560198 master-0 kubenswrapper[17876]: I0313 10:42:32.560144 17876 scope.go:117] "RemoveContainer" containerID="9e202824b084c4177db3bd9002d881090f9c8da16dc67819aecdad944afe647d" Mar 13 10:42:32.579115 master-0 kubenswrapper[17876]: I0313 10:42:32.577073 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 10:42:32.579481 master-0 kubenswrapper[17876]: I0313 10:42:32.579281 17876 scope.go:117] "RemoveContainer" containerID="038536df2c456779ce7e0291a2536f4028dbe7eacec6c366598f83e56cd809ba" Mar 13 10:42:32.598142 master-0 kubenswrapper[17876]: I0313 10:42:32.588907 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kube-api-access\") pod \"installer-3-master-0\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.614713 master-0 kubenswrapper[17876]: I0313 10:42:32.614671 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hnsk9" Mar 13 10:42:32.622962 master-0 
kubenswrapper[17876]: I0313 10:42:32.622886 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:42:32.747086 master-0 kubenswrapper[17876]: I0313 10:42:32.747035 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body= Mar 13 10:42:32.747226 master-0 kubenswrapper[17876]: I0313 10:42:32.747116 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" Mar 13 10:42:32.930005 master-0 kubenswrapper[17876]: I0313 10:42:32.929937 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:42:32.965249 master-0 kubenswrapper[17876]: I0313 10:42:32.965042 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:32.965249 master-0 kubenswrapper[17876]: I0313 10:42:32.965120 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:32.965249 master-0 kubenswrapper[17876]: I0313 10:42:32.965149 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:32.965249 master-0 kubenswrapper[17876]: I0313 10:42:32.965168 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:32.966672 master-0 kubenswrapper[17876]: I0313 10:42:32.966589 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig" (OuterVolumeSpecName: 
"v4-0-config-system-cliconfig") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:42:32.970547 master-0 kubenswrapper[17876]: I0313 10:42:32.968928 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:32.974320 master-0 kubenswrapper[17876]: I0313 10:42:32.974263 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:32.975346 master-0 kubenswrapper[17876]: I0313 10:42:32.974847 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-65f5b9dbcc-62t45"] Mar 13 10:42:32.975346 master-0 kubenswrapper[17876]: E0313 10:42:32.975182 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" containerName="oauth-openshift" Mar 13 10:42:32.975346 master-0 kubenswrapper[17876]: I0313 10:42:32.975208 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" containerName="oauth-openshift" Mar 13 10:42:32.975346 master-0 kubenswrapper[17876]: I0313 10:42:32.975358 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" containerName="oauth-openshift" Mar 13 10:42:32.975840 master-0 kubenswrapper[17876]: I0313 10:42:32.975765 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:32.976250 master-0 kubenswrapper[17876]: I0313 10:42:32.976207 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:32.986358 master-0 kubenswrapper[17876]: I0313 10:42:32.986035 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65f5b9dbcc-62t45"] Mar 13 10:42:32.994737 master-0 kubenswrapper[17876]: I0313 10:42:32.992737 17876 generic.go:334] "Generic (PLEG): container finished" podID="f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" containerID="a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771" exitCode=0 Mar 13 10:42:32.994737 master-0 kubenswrapper[17876]: I0313 10:42:32.992844 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" event={"ID":"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9","Type":"ContainerDied","Data":"a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771"} Mar 13 10:42:32.994737 master-0 kubenswrapper[17876]: I0313 10:42:32.992879 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" event={"ID":"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9","Type":"ContainerDied","Data":"8ac13dfa7991399d25aa89c8b83f457b47e56c78ad30fea41fae3c6f40d2a64e"} Mar 13 10:42:32.994737 master-0 kubenswrapper[17876]: I0313 10:42:32.992900 17876 scope.go:117] "RemoveContainer" containerID="a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771" Mar 13 10:42:32.994737 master-0 kubenswrapper[17876]: I0313 10:42:32.993086 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-5db65d9766-lg686" Mar 13 10:42:33.018944 master-0 kubenswrapper[17876]: I0313 10:42:33.018906 17876 scope.go:117] "RemoveContainer" containerID="a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771" Mar 13 10:42:33.019532 master-0 kubenswrapper[17876]: E0313 10:42:33.019503 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771\": container with ID starting with a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771 not found: ID does not exist" containerID="a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771" Mar 13 10:42:33.019593 master-0 kubenswrapper[17876]: I0313 10:42:33.019540 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771"} err="failed to get container status \"a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771\": rpc error: code = NotFound desc = could not find container \"a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771\": container with ID starting with a02cd3aa3d1b07051abe820550ff1ed045f0f091edf2c855e0d900df5eed3771 not found: ID does not exist" Mar 13 10:42:33.066363 master-0 kubenswrapper[17876]: I0313 10:42:33.066295 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.066904 master-0 kubenswrapper[17876]: I0313 10:42:33.066861 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.066967 master-0 kubenswrapper[17876]: I0313 10:42:33.066915 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.066967 master-0 kubenswrapper[17876]: I0313 10:42:33.066926 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:42:33.066967 master-0 kubenswrapper[17876]: I0313 10:42:33.066961 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.067078 master-0 kubenswrapper[17876]: I0313 10:42:33.066992 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.067078 master-0 kubenswrapper[17876]: I0313 10:42:33.067015 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-dir\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.067078 master-0 kubenswrapper[17876]: I0313 10:42:33.067050 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.067248 master-0 kubenswrapper[17876]: I0313 10:42:33.067105 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hfjxf\" (UniqueName: \"kubernetes.io/projected/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-kube-api-access-hfjxf\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.067248 master-0 
kubenswrapper[17876]: I0313 10:42:33.067135 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") pod \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\" (UID: \"f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9\") " Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067314 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-login\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067374 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067382 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-dir\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067459 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067487 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-service-ca\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067516 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 
10:42:33.067539 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-session\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067562 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8fwn\" (UniqueName: \"kubernetes.io/projected/8cfba211-2658-42fe-ac6b-6b6cba002b99-kube-api-access-f8fwn\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067611 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067640 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.067689 master-0 kubenswrapper[17876]: I0313 10:42:33.067661 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067755 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-policies\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067809 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-error\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067863 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-router-certs\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067920 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067937 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067949 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067955 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.067992 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.068007 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.068049 master-0 kubenswrapper[17876]: I0313 10:42:33.068032 17876 
reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.068369 master-0 kubenswrapper[17876]: I0313 10:42:33.068045 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:42:33.069897 master-0 kubenswrapper[17876]: I0313 10:42:33.069855 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-kube-api-access-hfjxf" (OuterVolumeSpecName: "kube-api-access-hfjxf") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "kube-api-access-hfjxf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:42:33.069989 master-0 kubenswrapper[17876]: I0313 10:42:33.069907 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:33.070340 master-0 kubenswrapper[17876]: I0313 10:42:33.070316 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:33.070808 master-0 kubenswrapper[17876]: I0313 10:42:33.070780 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:33.071173 master-0 kubenswrapper[17876]: I0313 10:42:33.071140 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" (UID: "f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:42:33.127826 master-0 kubenswrapper[17876]: I0313 10:42:33.127779 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 13 10:42:33.141064 master-0 kubenswrapper[17876]: W0313 10:42:33.141021 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod6bb1dcdd_03f9_4a09_868f_c574cd2e13ab.slice/crio-273287cb0b4b4109c5c3555fcccd825b7fcc83c65d60d66f9a122bd8f8599aa3 WatchSource:0}: Error finding container 273287cb0b4b4109c5c3555fcccd825b7fcc83c65d60d66f9a122bd8f8599aa3: Status 404 returned error can't find the container with id 273287cb0b4b4109c5c3555fcccd825b7fcc83c65d60d66f9a122bd8f8599aa3 Mar 13 10:42:33.168755 master-0 kubenswrapper[17876]: I0313 10:42:33.168697 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-dir\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.168884 master-0 kubenswrapper[17876]: I0313 10:42:33.168789 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.168884 master-0 kubenswrapper[17876]: I0313 10:42:33.168823 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-service-ca\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: 
\"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.168973 master-0 kubenswrapper[17876]: I0313 10:42:33.168914 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.168973 master-0 kubenswrapper[17876]: I0313 10:42:33.168953 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-session\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169054 master-0 kubenswrapper[17876]: I0313 10:42:33.168975 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8fwn\" (UniqueName: \"kubernetes.io/projected/8cfba211-2658-42fe-ac6b-6b6cba002b99-kube-api-access-f8fwn\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169054 master-0 kubenswrapper[17876]: I0313 10:42:33.169003 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169121 master-0 kubenswrapper[17876]: I0313 10:42:33.169048 
17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169121 master-0 kubenswrapper[17876]: I0313 10:42:33.169076 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169205 master-0 kubenswrapper[17876]: I0313 10:42:33.169121 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-policies\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169205 master-0 kubenswrapper[17876]: I0313 10:42:33.169146 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-error\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169205 master-0 kubenswrapper[17876]: I0313 10:42:33.169179 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-router-certs\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 10:42:33.169207 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-login\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 10:42:33.169257 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 10:42:33.169272 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 10:42:33.169286 17876 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 10:42:33.169302 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 
10:42:33.169321 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.169334 master-0 kubenswrapper[17876]: I0313 10:42:33.169336 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hfjxf\" (UniqueName: \"kubernetes.io/projected/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-kube-api-access-hfjxf\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.169572 master-0 kubenswrapper[17876]: I0313 10:42:33.169350 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 13 10:42:33.171185 master-0 kubenswrapper[17876]: I0313 10:42:33.171129 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-service-ca\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.171388 master-0 kubenswrapper[17876]: I0313 10:42:33.171275 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.172005 master-0 kubenswrapper[17876]: I0313 10:42:33.171963 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-dir\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.172077 master-0 kubenswrapper[17876]: I0313 10:42:33.172016 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-policies\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.172470 master-0 kubenswrapper[17876]: I0313 10:42:33.172441 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-cliconfig\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.172853 master-0 kubenswrapper[17876]: I0313 10:42:33.172829 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-login\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.173881 master-0 kubenswrapper[17876]: I0313 10:42:33.173855 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-session\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 
10:42:33.174936 master-0 kubenswrapper[17876]: I0313 10:42:33.174889 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-serving-cert\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.175332 master-0 kubenswrapper[17876]: I0313 10:42:33.175289 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.175332 master-0 kubenswrapper[17876]: I0313 10:42:33.175327 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-error\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.177924 master-0 kubenswrapper[17876]: I0313 10:42:33.177863 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.178703 master-0 kubenswrapper[17876]: I0313 10:42:33.178677 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-router-certs\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.190558 master-0 kubenswrapper[17876]: I0313 10:42:33.190503 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8fwn\" (UniqueName: \"kubernetes.io/projected/8cfba211-2658-42fe-ac6b-6b6cba002b99-kube-api-access-f8fwn\") pod \"oauth-openshift-65f5b9dbcc-62t45\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.308296 master-0 kubenswrapper[17876]: I0313 10:42:33.308237 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:33.355326 master-0 kubenswrapper[17876]: I0313 10:42:33.355053 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-5db65d9766-lg686"] Mar 13 10:42:33.386991 master-0 kubenswrapper[17876]: I0313 10:42:33.386822 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-5db65d9766-lg686"] Mar 13 10:42:33.780890 master-0 kubenswrapper[17876]: I0313 10:42:33.780677 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-65f5b9dbcc-62t45"] Mar 13 10:42:33.790851 master-0 kubenswrapper[17876]: W0313 10:42:33.790791 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cfba211_2658_42fe_ac6b_6b6cba002b99.slice/crio-2c3c6c2635030ddb4fb4810666d13cc2ee4a1ad8653c62eed8f5a0f4b3d61fa9 WatchSource:0}: Error finding container 2c3c6c2635030ddb4fb4810666d13cc2ee4a1ad8653c62eed8f5a0f4b3d61fa9: Status 404 returned error can't 
find the container with id 2c3c6c2635030ddb4fb4810666d13cc2ee4a1ad8653c62eed8f5a0f4b3d61fa9 Mar 13 10:42:34.002494 master-0 kubenswrapper[17876]: I0313 10:42:34.002280 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" event={"ID":"8cfba211-2658-42fe-ac6b-6b6cba002b99","Type":"ContainerStarted","Data":"2c3c6c2635030ddb4fb4810666d13cc2ee4a1ad8653c62eed8f5a0f4b3d61fa9"} Mar 13 10:42:34.004285 master-0 kubenswrapper[17876]: I0313 10:42:34.004222 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab","Type":"ContainerStarted","Data":"e9348f808b26e99ad40729b8262d004e235ea5263723a6967dd0fbdf746974dc"} Mar 13 10:42:34.004285 master-0 kubenswrapper[17876]: I0313 10:42:34.004246 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab","Type":"ContainerStarted","Data":"273287cb0b4b4109c5c3555fcccd825b7fcc83c65d60d66f9a122bd8f8599aa3"} Mar 13 10:42:34.507285 master-0 kubenswrapper[17876]: I0313 10:42:34.507231 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9" path="/var/lib/kubelet/pods/f6490f9f-b03e-4b2f-a8e9-56a9cfdb18f9/volumes" Mar 13 10:42:35.018382 master-0 kubenswrapper[17876]: I0313 10:42:35.018309 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" event={"ID":"8cfba211-2658-42fe-ac6b-6b6cba002b99","Type":"ContainerStarted","Data":"ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b"} Mar 13 10:42:35.018382 master-0 kubenswrapper[17876]: I0313 10:42:35.018363 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:35.026818 master-0 kubenswrapper[17876]: I0313 
10:42:35.026747 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:42:35.046220 master-0 kubenswrapper[17876]: I0313 10:42:35.043666 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" podStartSLOduration=3.043641125 podStartE2EDuration="3.043641125s" podCreationTimestamp="2026-03-13 10:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:42:34.027987878 +0000 UTC m=+61.863794364" watchObservedRunningTime="2026-03-13 10:42:35.043641125 +0000 UTC m=+62.879447631" Mar 13 10:42:35.046220 master-0 kubenswrapper[17876]: I0313 10:42:35.044766 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" podStartSLOduration=33.044757687 podStartE2EDuration="33.044757687s" podCreationTimestamp="2026-03-13 10:42:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:42:35.040945107 +0000 UTC m=+62.876751623" watchObservedRunningTime="2026-03-13 10:42:35.044757687 +0000 UTC m=+62.880564193" Mar 13 10:42:37.208863 master-0 kubenswrapper[17876]: I0313 10:42:37.208807 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79847c4f97-tf57f"] Mar 13 10:42:37.209978 master-0 kubenswrapper[17876]: I0313 10:42:37.209748 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" containerID="cri-o://54da7b3235abaa116243b07a8cea7e97784d45d4d84871349e58b575ce64f621" gracePeriod=30 Mar 13 10:42:37.241500 master-0 
kubenswrapper[17876]: I0313 10:42:37.241435 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"] Mar 13 10:42:37.241783 master-0 kubenswrapper[17876]: I0313 10:42:37.241702 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" podUID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" containerName="route-controller-manager" containerID="cri-o://c81f55f61228604f6223600595f4c2e8e2f4dceb06b2bdad97b4839cb1807b1b" gracePeriod=30 Mar 13 10:42:37.711800 master-0 kubenswrapper[17876]: I0313 10:42:37.711729 17876 patch_prober.go:28] interesting pod/route-controller-manager-9f8f9b5c9-pjljl container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Mar 13 10:42:37.711800 master-0 kubenswrapper[17876]: I0313 10:42:37.711781 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" podUID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Mar 13 10:42:38.342703 master-0 kubenswrapper[17876]: I0313 10:42:38.334228 17876 patch_prober.go:28] interesting pod/controller-manager-79847c4f97-tf57f container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" start-of-body= Mar 13 10:42:38.342703 master-0 kubenswrapper[17876]: I0313 10:42:38.334295 17876 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.55:8443/healthz\": dial tcp 10.128.0.55:8443: connect: connection refused" Mar 13 10:42:40.331365 master-0 kubenswrapper[17876]: I0313 10:42:40.331315 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 13 10:42:40.331936 master-0 kubenswrapper[17876]: I0313 10:42:40.331376 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 13 10:42:42.738218 master-0 kubenswrapper[17876]: I0313 10:42:42.738137 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body= Mar 13 10:42:42.738815 master-0 kubenswrapper[17876]: I0313 10:42:42.738239 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" Mar 13 10:42:47.119247 master-0 kubenswrapper[17876]: I0313 10:42:47.116527 17876 generic.go:334] "Generic (PLEG): container finished" podID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerID="54da7b3235abaa116243b07a8cea7e97784d45d4d84871349e58b575ce64f621" exitCode=0 Mar 13 
10:42:47.119247 master-0 kubenswrapper[17876]: I0313 10:42:47.116636 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerDied","Data":"54da7b3235abaa116243b07a8cea7e97784d45d4d84871349e58b575ce64f621"} Mar 13 10:42:47.119247 master-0 kubenswrapper[17876]: I0313 10:42:47.116683 17876 scope.go:117] "RemoveContainer" containerID="34f271f240a5a92d84425b4acb8e33c675ab8a355af9a316345e90eee5490104" Mar 13 10:42:47.133463 master-0 kubenswrapper[17876]: I0313 10:42:47.133311 17876 generic.go:334] "Generic (PLEG): container finished" podID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" containerID="c81f55f61228604f6223600595f4c2e8e2f4dceb06b2bdad97b4839cb1807b1b" exitCode=0 Mar 13 10:42:47.133463 master-0 kubenswrapper[17876]: I0313 10:42:47.133376 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" event={"ID":"21bb85e2-0d4a-418f-a7c9-482e8eafce19","Type":"ContainerDied","Data":"c81f55f61228604f6223600595f4c2e8e2f4dceb06b2bdad97b4839cb1807b1b"} Mar 13 10:42:47.711205 master-0 kubenswrapper[17876]: I0313 10:42:47.708955 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" Mar 13 10:42:47.825875 master-0 kubenswrapper[17876]: I0313 10:42:47.825843 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" Mar 13 10:42:47.904633 master-0 kubenswrapper[17876]: I0313 10:42:47.904515 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") pod \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " Mar 13 10:42:47.904960 master-0 kubenswrapper[17876]: I0313 10:42:47.904655 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") pod \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " Mar 13 10:42:47.904960 master-0 kubenswrapper[17876]: I0313 10:42:47.904762 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") pod \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " Mar 13 10:42:47.905046 master-0 kubenswrapper[17876]: I0313 10:42:47.905015 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") pod \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\" (UID: \"21bb85e2-0d4a-418f-a7c9-482e8eafce19\") " Mar 13 10:42:47.905422 master-0 kubenswrapper[17876]: I0313 10:42:47.905368 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca" (OuterVolumeSpecName: "client-ca") pod "21bb85e2-0d4a-418f-a7c9-482e8eafce19" (UID: "21bb85e2-0d4a-418f-a7c9-482e8eafce19"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:42:47.905484 master-0 kubenswrapper[17876]: I0313 10:42:47.905423 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config" (OuterVolumeSpecName: "config") pod "21bb85e2-0d4a-418f-a7c9-482e8eafce19" (UID: "21bb85e2-0d4a-418f-a7c9-482e8eafce19"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:42:47.905571 master-0 kubenswrapper[17876]: I0313 10:42:47.905531 17876 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:47.910585 master-0 kubenswrapper[17876]: I0313 10:42:47.910541 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "21bb85e2-0d4a-418f-a7c9-482e8eafce19" (UID: "21bb85e2-0d4a-418f-a7c9-482e8eafce19"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:42:47.910720 master-0 kubenswrapper[17876]: I0313 10:42:47.910697 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt" (OuterVolumeSpecName: "kube-api-access-xl7xt") pod "21bb85e2-0d4a-418f-a7c9-482e8eafce19" (UID: "21bb85e2-0d4a-418f-a7c9-482e8eafce19"). InnerVolumeSpecName "kube-api-access-xl7xt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:42:48.007808 master-0 kubenswrapper[17876]: I0313 10:42:48.007642 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") pod \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") "
Mar 13 10:42:48.007808 master-0 kubenswrapper[17876]: I0313 10:42:48.007728 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") pod \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") "
Mar 13 10:42:48.007808 master-0 kubenswrapper[17876]: I0313 10:42:48.007762 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") pod \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") "
Mar 13 10:42:48.008308 master-0 kubenswrapper[17876]: I0313 10:42:48.007808 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") pod \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") "
Mar 13 10:42:48.008308 master-0 kubenswrapper[17876]: I0313 10:42:48.007921 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") pod \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\" (UID: \"3bf5e05a-443b-41dc-b464-3d2f1ace50a0\") "
Mar 13 10:42:48.008308 master-0 kubenswrapper[17876]: I0313 10:42:48.008257 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xl7xt\" (UniqueName: \"kubernetes.io/projected/21bb85e2-0d4a-418f-a7c9-482e8eafce19-kube-api-access-xl7xt\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.008308 master-0 kubenswrapper[17876]: I0313 10:42:48.008276 17876 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/21bb85e2-0d4a-418f-a7c9-482e8eafce19-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.008308 master-0 kubenswrapper[17876]: I0313 10:42:48.008288 17876 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/21bb85e2-0d4a-418f-a7c9-482e8eafce19-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.009156 master-0 kubenswrapper[17876]: I0313 10:42:48.009060 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3bf5e05a-443b-41dc-b464-3d2f1ace50a0" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:42:48.009277 master-0 kubenswrapper[17876]: I0313 10:42:48.009196 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca" (OuterVolumeSpecName: "client-ca") pod "3bf5e05a-443b-41dc-b464-3d2f1ace50a0" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:42:48.009344 master-0 kubenswrapper[17876]: I0313 10:42:48.009247 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config" (OuterVolumeSpecName: "config") pod "3bf5e05a-443b-41dc-b464-3d2f1ace50a0" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 13 10:42:48.014349 master-0 kubenswrapper[17876]: I0313 10:42:48.014293 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6" (OuterVolumeSpecName: "kube-api-access-4xqz6") pod "3bf5e05a-443b-41dc-b464-3d2f1ace50a0" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0"). InnerVolumeSpecName "kube-api-access-4xqz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:42:48.014448 master-0 kubenswrapper[17876]: I0313 10:42:48.014331 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3bf5e05a-443b-41dc-b464-3d2f1ace50a0" (UID: "3bf5e05a-443b-41dc-b464-3d2f1ace50a0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 13 10:42:48.109241 master-0 kubenswrapper[17876]: I0313 10:42:48.109145 17876 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-config\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.109241 master-0 kubenswrapper[17876]: I0313 10:42:48.109192 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4xqz6\" (UniqueName: \"kubernetes.io/projected/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-kube-api-access-4xqz6\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.109241 master-0 kubenswrapper[17876]: I0313 10:42:48.109203 17876 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.109241 master-0 kubenswrapper[17876]: I0313 10:42:48.109211 17876 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.109241 master-0 kubenswrapper[17876]: I0313 10:42:48.109220 17876 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bf5e05a-443b-41dc-b464-3d2f1ace50a0-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 13 10:42:48.141934 master-0 kubenswrapper[17876]: I0313 10:42:48.141857 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f"
Mar 13 10:42:48.142538 master-0 kubenswrapper[17876]: I0313 10:42:48.141860 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79847c4f97-tf57f" event={"ID":"3bf5e05a-443b-41dc-b464-3d2f1ace50a0","Type":"ContainerDied","Data":"5f4e5674ade432e52f9563a1f07684d2d9624c5df1e6b8e0fa3c971d3c078df8"}
Mar 13 10:42:48.142538 master-0 kubenswrapper[17876]: I0313 10:42:48.142022 17876 scope.go:117] "RemoveContainer" containerID="54da7b3235abaa116243b07a8cea7e97784d45d4d84871349e58b575ce64f621"
Mar 13 10:42:48.144706 master-0 kubenswrapper[17876]: I0313 10:42:48.144593 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl" event={"ID":"21bb85e2-0d4a-418f-a7c9-482e8eafce19","Type":"ContainerDied","Data":"7623887564e1fd29b1c01e5d18c6715a43b71a693407bef1bea029e2735f11dd"}
Mar 13 10:42:48.144706 master-0 kubenswrapper[17876]: I0313 10:42:48.144698 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"
Mar 13 10:42:48.160213 master-0 kubenswrapper[17876]: I0313 10:42:48.160171 17876 scope.go:117] "RemoveContainer" containerID="c81f55f61228604f6223600595f4c2e8e2f4dceb06b2bdad97b4839cb1807b1b"
Mar 13 10:42:48.924524 master-0 kubenswrapper[17876]: I0313 10:42:48.924423 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"]
Mar 13 10:42:48.924771 master-0 kubenswrapper[17876]: E0313 10:42:48.924749 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager"
Mar 13 10:42:48.924831 master-0 kubenswrapper[17876]: I0313 10:42:48.924770 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager"
Mar 13 10:42:48.924831 master-0 kubenswrapper[17876]: E0313 10:42:48.924815 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" containerName="route-controller-manager"
Mar 13 10:42:48.924831 master-0 kubenswrapper[17876]: I0313 10:42:48.924825 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" containerName="route-controller-manager"
Mar 13 10:42:48.925008 master-0 kubenswrapper[17876]: I0313 10:42:48.924987 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager"
Mar 13 10:42:48.925041 master-0 kubenswrapper[17876]: I0313 10:42:48.925009 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" containerName="route-controller-manager"
Mar 13 10:42:48.925041 master-0 kubenswrapper[17876]: I0313 10:42:48.925025 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager"
Mar 13 10:42:48.925572 master-0 kubenswrapper[17876]: I0313 10:42:48.925544 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:48.927866 master-0 kubenswrapper[17876]: I0313 10:42:48.927839 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-2p9p4"
Mar 13 10:42:48.929299 master-0 kubenswrapper[17876]: I0313 10:42:48.928060 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 10:42:48.929299 master-0 kubenswrapper[17876]: I0313 10:42:48.928222 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 10:42:48.929299 master-0 kubenswrapper[17876]: I0313 10:42:48.928338 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:42:48.929299 master-0 kubenswrapper[17876]: I0313 10:42:48.928440 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 10:42:48.929299 master-0 kubenswrapper[17876]: I0313 10:42:48.928575 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 10:42:49.023777 master-0 kubenswrapper[17876]: I0313 10:42:49.023646 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a4a18e-2256-4c27-9953-1a9dca3926d6-client-ca\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.024130 master-0 kubenswrapper[17876]: I0313 10:42:49.024075 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a4a18e-2256-4c27-9953-1a9dca3926d6-config\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.024234 master-0 kubenswrapper[17876]: I0313 10:42:49.024220 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a4a18e-2256-4c27-9953-1a9dca3926d6-serving-cert\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.024382 master-0 kubenswrapper[17876]: I0313 10:42:49.024365 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5845w\" (UniqueName: \"kubernetes.io/projected/86a4a18e-2256-4c27-9953-1a9dca3926d6-kube-api-access-5845w\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.125731 master-0 kubenswrapper[17876]: I0313 10:42:49.125666 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a4a18e-2256-4c27-9953-1a9dca3926d6-client-ca\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.126283 master-0 kubenswrapper[17876]: I0313 10:42:49.126250 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a4a18e-2256-4c27-9953-1a9dca3926d6-config\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.126490 master-0 kubenswrapper[17876]: I0313 10:42:49.126461 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a4a18e-2256-4c27-9953-1a9dca3926d6-serving-cert\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.126724 master-0 kubenswrapper[17876]: I0313 10:42:49.126692 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5845w\" (UniqueName: \"kubernetes.io/projected/86a4a18e-2256-4c27-9953-1a9dca3926d6-kube-api-access-5845w\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.127273 master-0 kubenswrapper[17876]: I0313 10:42:49.127214 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/86a4a18e-2256-4c27-9953-1a9dca3926d6-client-ca\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.128697 master-0 kubenswrapper[17876]: I0313 10:42:49.128638 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/86a4a18e-2256-4c27-9953-1a9dca3926d6-config\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.131353 master-0 kubenswrapper[17876]: I0313 10:42:49.131292 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/86a4a18e-2256-4c27-9953-1a9dca3926d6-serving-cert\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.584411 master-0 kubenswrapper[17876]: I0313 10:42:49.584299 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"]
Mar 13 10:42:49.594532 master-0 kubenswrapper[17876]: I0313 10:42:49.594452 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5845w\" (UniqueName: \"kubernetes.io/projected/86a4a18e-2256-4c27-9953-1a9dca3926d6-kube-api-access-5845w\") pod \"route-controller-manager-69c7cffc4c-7h7mv\" (UID: \"86a4a18e-2256-4c27-9953-1a9dca3926d6\") " pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:49.846469 master-0 kubenswrapper[17876]: I0313 10:42:49.846112 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:50.087335 master-0 kubenswrapper[17876]: I0313 10:42:50.087250 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79847c4f97-tf57f"]
Mar 13 10:42:50.092487 master-0 kubenswrapper[17876]: I0313 10:42:50.092386 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79847c4f97-tf57f"]
Mar 13 10:42:50.330939 master-0 kubenswrapper[17876]: I0313 10:42:50.330821 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:42:50.330939 master-0 kubenswrapper[17876]: I0313 10:42:50.330910 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:42:50.505988 master-0 kubenswrapper[17876]: I0313 10:42:50.505887 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" path="/var/lib/kubelet/pods/3bf5e05a-443b-41dc-b464-3d2f1ace50a0/volumes"
Mar 13 10:42:51.013994 master-0 kubenswrapper[17876]: I0313 10:42:51.013892 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"]
Mar 13 10:42:51.974732 master-0 kubenswrapper[17876]: I0313 10:42:51.974612 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-9f8f9b5c9-pjljl"]
Mar 13 10:42:51.977350 master-0 kubenswrapper[17876]: I0313 10:42:51.977294 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"]
Mar 13 10:42:51.977599 master-0 kubenswrapper[17876]: E0313 10:42:51.977559 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager"
Mar 13 10:42:51.977599 master-0 kubenswrapper[17876]: I0313 10:42:51.977583 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bf5e05a-443b-41dc-b464-3d2f1ace50a0" containerName="controller-manager"
Mar 13 10:42:51.978265 master-0 kubenswrapper[17876]: I0313 10:42:51.978212 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:51.983867 master-0 kubenswrapper[17876]: I0313 10:42:51.983825 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-jckzr"
Mar 13 10:42:51.984126 master-0 kubenswrapper[17876]: I0313 10:42:51.984062 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 10:42:51.984359 master-0 kubenswrapper[17876]: I0313 10:42:51.984329 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 10:42:51.984762 master-0 kubenswrapper[17876]: I0313 10:42:51.984724 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 10:42:51.985611 master-0 kubenswrapper[17876]: I0313 10:42:51.985566 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 10:42:51.988017 master-0 kubenswrapper[17876]: I0313 10:42:51.987984 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:42:51.993080 master-0 kubenswrapper[17876]: I0313 10:42:51.992998 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 10:42:52.079772 master-0 kubenswrapper[17876]: I0313 10:42:52.079586 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-client-ca\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.079772 master-0 kubenswrapper[17876]: I0313 10:42:52.079760 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-config\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.080878 master-0 kubenswrapper[17876]: I0313 10:42:52.079862 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-proxy-ca-bundles\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.080878 master-0 kubenswrapper[17876]: I0313 10:42:52.080003 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdhl7\" (UniqueName: \"kubernetes.io/projected/1b7e4f08-d451-4e67-8472-4de6270ee72c-kube-api-access-wdhl7\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.080878 master-0 kubenswrapper[17876]: I0313 10:42:52.080261 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b7e4f08-d451-4e67-8472-4de6270ee72c-serving-cert\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.196540 master-0 kubenswrapper[17876]: I0313 10:42:52.196424 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-client-ca\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.196540 master-0 kubenswrapper[17876]: I0313 10:42:52.194454 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-client-ca\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.197005 master-0 kubenswrapper[17876]: I0313 10:42:52.196601 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-config\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.199719 master-0 kubenswrapper[17876]: I0313 10:42:52.199632 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-proxy-ca-bundles\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.199719 master-0 kubenswrapper[17876]: I0313 10:42:52.199698 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wdhl7\" (UniqueName: \"kubernetes.io/projected/1b7e4f08-d451-4e67-8472-4de6270ee72c-kube-api-access-wdhl7\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.200007 master-0 kubenswrapper[17876]: I0313 10:42:52.199873 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b7e4f08-d451-4e67-8472-4de6270ee72c-serving-cert\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.200931 master-0 kubenswrapper[17876]: I0313 10:42:52.199462 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-config\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.205253 master-0 kubenswrapper[17876]: I0313 10:42:52.205179 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b7e4f08-d451-4e67-8472-4de6270ee72c-proxy-ca-bundles\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.205836 master-0 kubenswrapper[17876]: I0313 10:42:52.205796 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b7e4f08-d451-4e67-8472-4de6270ee72c-serving-cert\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.236603 master-0 kubenswrapper[17876]: I0313 10:42:52.236451 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-pgdzh" event={"ID":"3db4d656-fd93-41a2-8e49-0d1685c4e3d3","Type":"ContainerStarted","Data":"01f1f0081abcaaf0fadf4f09c2abb77cf2071522e2505adce1edc61580224a10"}
Mar 13 10:42:52.236787 master-0 kubenswrapper[17876]: I0313 10:42:52.236730 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-pgdzh"
Mar 13 10:42:52.238325 master-0 kubenswrapper[17876]: I0313 10:42:52.238281 17876 patch_prober.go:28] interesting pod/downloads-84f57b9877-pgdzh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.83:8080/\": dial tcp 10.128.0.83:8080: connect: connection refused" start-of-body=
Mar 13 10:42:52.238430 master-0 kubenswrapper[17876]: I0313 10:42:52.238330 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-pgdzh" podUID="3db4d656-fd93-41a2-8e49-0d1685c4e3d3" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.83:8080/\": dial tcp 10.128.0.83:8080: connect: connection refused"
Mar 13 10:42:52.332699 master-0 kubenswrapper[17876]: I0313 10:42:52.331327 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"]
Mar 13 10:42:52.340120 master-0 kubenswrapper[17876]: I0313 10:42:52.338865 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"]
Mar 13 10:42:52.348574 master-0 kubenswrapper[17876]: I0313 10:42:52.348537 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdhl7\" (UniqueName: \"kubernetes.io/projected/1b7e4f08-d451-4e67-8472-4de6270ee72c-kube-api-access-wdhl7\") pod \"controller-manager-5dcc5796bd-kgx74\" (UID: \"1b7e4f08-d451-4e67-8472-4de6270ee72c\") " pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.503933 master-0 kubenswrapper[17876]: I0313 10:42:52.503871 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21bb85e2-0d4a-418f-a7c9-482e8eafce19" path="/var/lib/kubelet/pods/21bb85e2-0d4a-418f-a7c9-482e8eafce19/volumes"
Mar 13 10:42:52.598008 master-0 kubenswrapper[17876]: I0313 10:42:52.597897 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:52.737726 master-0 kubenswrapper[17876]: I0313 10:42:52.737624 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body=
Mar 13 10:42:52.737726 master-0 kubenswrapper[17876]: I0313 10:42:52.737702 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused"
Mar 13 10:42:53.249387 master-0 kubenswrapper[17876]: I0313 10:42:53.249120 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" event={"ID":"86a4a18e-2256-4c27-9953-1a9dca3926d6","Type":"ContainerStarted","Data":"29b7f8122b28d14392b2329bcad49a6fb43b2112d07f9c58515875136d5160c6"}
Mar 13 10:42:53.249387 master-0 kubenswrapper[17876]: I0313 10:42:53.249261 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" event={"ID":"86a4a18e-2256-4c27-9953-1a9dca3926d6","Type":"ContainerStarted","Data":"c6742c240ad24fcb49cafc3b8bf67308169b0cd4cd047768187c99e0531219bf"}
Mar 13 10:42:53.249387 master-0 kubenswrapper[17876]: I0313 10:42:53.249305 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:53.253001 master-0 kubenswrapper[17876]: I0313 10:42:53.252902 17876 patch_prober.go:28] interesting pod/downloads-84f57b9877-pgdzh container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.83:8080/\": dial tcp 10.128.0.83:8080: connect: connection refused" start-of-body=
Mar 13 10:42:53.253179 master-0 kubenswrapper[17876]: I0313 10:42:53.253024 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-pgdzh" podUID="3db4d656-fd93-41a2-8e49-0d1685c4e3d3" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.83:8080/\": dial tcp 10.128.0.83:8080: connect: connection refused"
Mar 13 10:42:53.285727 master-0 kubenswrapper[17876]: I0313 10:42:53.281929 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-pgdzh" podStartSLOduration=3.98696613 podStartE2EDuration="46.281901321s" podCreationTimestamp="2026-03-13 10:42:07 +0000 UTC" firstStartedPulling="2026-03-13 10:42:08.352987821 +0000 UTC m=+36.188794297" lastFinishedPulling="2026-03-13 10:42:50.647923022 +0000 UTC m=+78.483729488" observedRunningTime="2026-03-13 10:42:53.280517461 +0000 UTC m=+81.116323947" watchObservedRunningTime="2026-03-13 10:42:53.281901321 +0000 UTC m=+81.117707807"
Mar 13 10:42:53.285727 master-0 kubenswrapper[17876]: I0313 10:42:53.284022 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"]
Mar 13 10:42:53.457385 master-0 kubenswrapper[17876]: I0313 10:42:53.457333 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:42:53.498026 master-0 kubenswrapper[17876]: I0313 10:42:53.497925 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" podStartSLOduration=16.497893815 podStartE2EDuration="16.497893815s" podCreationTimestamp="2026-03-13 10:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:42:53.428216321 +0000 UTC m=+81.264022807" watchObservedRunningTime="2026-03-13 10:42:53.497893815 +0000 UTC m=+81.333700291"
Mar 13 10:42:53.506082 master-0 kubenswrapper[17876]: I0313 10:42:53.505924 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-65f5b9dbcc-62t45"]
Mar 13 10:42:54.255197 master-0 kubenswrapper[17876]: I0313 10:42:54.255115 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" event={"ID":"1b7e4f08-d451-4e67-8472-4de6270ee72c","Type":"ContainerStarted","Data":"6932138c1740c44c27e9e781bf8fa7fcf05f0501eb723d550fb039d9e4b714bb"}
Mar 13 10:42:54.255197 master-0 kubenswrapper[17876]: I0313 10:42:54.255159 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" event={"ID":"1b7e4f08-d451-4e67-8472-4de6270ee72c","Type":"ContainerStarted","Data":"d0342838f8ede63934f0f3ba1463729171b37f1d1d9314f7b6c4d4a5d9aae5ee"}
Mar 13 10:42:54.255851 master-0 kubenswrapper[17876]: I0313 10:42:54.255479 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:54.262283 master-0 kubenswrapper[17876]: I0313 10:42:54.262231 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74"
Mar 13 10:42:54.298686 master-0 kubenswrapper[17876]: I0313 10:42:54.298608 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" podStartSLOduration=17.29859038 podStartE2EDuration="17.29859038s" podCreationTimestamp="2026-03-13 10:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:42:54.277641781 +0000 UTC m=+82.113448307" watchObservedRunningTime="2026-03-13 10:42:54.29859038 +0000 UTC m=+82.134396856"
Mar 13 10:42:57.835149 master-0 kubenswrapper[17876]: I0313 10:42:57.835066 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-pgdzh"
Mar 13 10:43:00.331905 master-0 kubenswrapper[17876]: I0313 10:43:00.331808 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:43:00.332516 master-0 kubenswrapper[17876]: I0313 10:43:00.331898 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:43:02.194707 master-0 kubenswrapper[17876]: I0313 10:43:02.194625 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:43:02.200509 master-0 kubenswrapper[17876]: I0313 10:43:02.200411 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") " pod="openshift-kube-apiserver/installer-2-retry-1-master-0"
Mar 13 10:43:02.397757 master-0 kubenswrapper[17876]: I0313 10:43:02.397704 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") pod \"3b44838d-cfe0-42fe-9927-d0b5391eee81\" (UID: \"3b44838d-cfe0-42fe-9927-d0b5391eee81\") "
Mar 13 10:43:02.400574 master-0 kubenswrapper[17876]: I0313 10:43:02.400505 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3b44838d-cfe0-42fe-9927-d0b5391eee81" (UID: "3b44838d-cfe0-42fe-9927-d0b5391eee81"). InnerVolumeSpecName "kube-api-access".
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:43:02.499448 master-0 kubenswrapper[17876]: I0313 10:43:02.499330 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3b44838d-cfe0-42fe-9927-d0b5391eee81-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:02.738089 master-0 kubenswrapper[17876]: I0313 10:43:02.738009 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body= Mar 13 10:43:02.738380 master-0 kubenswrapper[17876]: I0313 10:43:02.738112 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" Mar 13 10:43:10.330591 master-0 kubenswrapper[17876]: I0313 10:43:10.330473 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 13 10:43:10.330591 master-0 kubenswrapper[17876]: I0313 10:43:10.330546 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 13 10:43:11.388270 master-0 kubenswrapper[17876]: I0313 10:43:11.388097 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 
10:43:11.389412 master-0 kubenswrapper[17876]: I0313 10:43:11.389175 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.408167 master-0 kubenswrapper[17876]: I0313 10:43:11.408135 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 10:43:11.408608 master-0 kubenswrapper[17876]: I0313 10:43:11.408571 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12" gracePeriod=15 Mar 13 10:43:11.408659 master-0 kubenswrapper[17876]: I0313 10:43:11.408585 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-check-endpoints" containerID="cri-o://5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc" gracePeriod=15 Mar 13 10:43:11.408659 master-0 kubenswrapper[17876]: I0313 10:43:11.408627 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-syncer" containerID="cri-o://f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6" gracePeriod=15 Mar 13 10:43:11.408747 master-0 kubenswrapper[17876]: I0313 10:43:11.408724 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver" containerID="cri-o://ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f" gracePeriod=15 Mar 13 
10:43:11.408800 master-0 kubenswrapper[17876]: I0313 10:43:11.408548 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2" gracePeriod=15 Mar 13 10:43:11.409840 master-0 kubenswrapper[17876]: I0313 10:43:11.409549 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 10:43:11.409907 master-0 kubenswrapper[17876]: E0313 10:43:11.409868 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 10:43:11.409907 master-0 kubenswrapper[17876]: I0313 10:43:11.409894 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: E0313 10:43:11.409919 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-check-endpoints" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: I0313 10:43:11.409929 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-check-endpoints" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: E0313 10:43:11.409952 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-syncer" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: I0313 10:43:11.409961 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-syncer" Mar 13 10:43:11.410007 master-0 
kubenswrapper[17876]: E0313 10:43:11.409974 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-insecure-readyz" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: I0313 10:43:11.409982 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-insecure-readyz" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: E0313 10:43:11.409996 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="setup" Mar 13 10:43:11.410007 master-0 kubenswrapper[17876]: I0313 10:43:11.410004 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="setup" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: E0313 10:43:11.410018 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410026 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410193 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-regeneration-controller" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410237 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-check-endpoints" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410249 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-insecure-readyz" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410269 
17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410276 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="setup" Mar 13 10:43:11.410455 master-0 kubenswrapper[17876]: I0313 10:43:11.410286 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c3280e9367536f782caf8bdc07edb85" containerName="kube-apiserver-cert-syncer" Mar 13 10:43:11.448215 master-0 kubenswrapper[17876]: I0313 10:43:11.447612 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:43:11.477113 master-0 kubenswrapper[17876]: I0313 10:43:11.477040 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.477371 master-0 kubenswrapper[17876]: I0313 10:43:11.477320 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.477371 master-0 kubenswrapper[17876]: I0313 10:43:11.477357 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: 
\"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.477494 master-0 kubenswrapper[17876]: I0313 10:43:11.477383 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.477494 master-0 kubenswrapper[17876]: I0313 10:43:11.477404 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.477736 master-0 kubenswrapper[17876]: I0313 10:43:11.477677 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.477819 master-0 kubenswrapper[17876]: I0313 10:43:11.477781 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.477908 master-0 kubenswrapper[17876]: I0313 10:43:11.477883 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" 
(UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.578377 master-0 kubenswrapper[17876]: I0313 10:43:11.578322 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.578377 master-0 kubenswrapper[17876]: I0313 10:43:11.578385 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.578753 master-0 kubenswrapper[17876]: I0313 10:43:11.578721 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.578815 master-0 kubenswrapper[17876]: I0313 10:43:11.578747 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.578880 master-0 kubenswrapper[17876]: I0313 10:43:11.578818 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.578939 master-0 kubenswrapper[17876]: I0313 10:43:11.578882 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.578939 master-0 kubenswrapper[17876]: I0313 10:43:11.578925 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.579046 master-0 kubenswrapper[17876]: I0313 10:43:11.578916 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.579046 master-0 kubenswrapper[17876]: I0313 10:43:11.578956 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.579046 master-0 
kubenswrapper[17876]: I0313 10:43:11.578968 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.579046 master-0 kubenswrapper[17876]: I0313 10:43:11.579002 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.579296 master-0 kubenswrapper[17876]: I0313 10:43:11.579067 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.579296 master-0 kubenswrapper[17876]: I0313 10:43:11.579079 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.579296 master-0 kubenswrapper[17876]: I0313 10:43:11.579063 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"077dd10388b9e3e48a07382126e86621\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:11.579448 master-0 kubenswrapper[17876]: I0313 
10:43:11.579300 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.579448 master-0 kubenswrapper[17876]: I0313 10:43:11.579418 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.740613 master-0 kubenswrapper[17876]: I0313 10:43:11.740436 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:43:11.761466 master-0 kubenswrapper[17876]: W0313 10:43:11.761398 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod899242a15b2bdf3b4a04fb323647ca94.slice/crio-1570cfbf47461df2f4d3fe410c1c775ba44b47e5aff3b1eadb939345aba8bfde WatchSource:0}: Error finding container 1570cfbf47461df2f4d3fe410c1c775ba44b47e5aff3b1eadb939345aba8bfde: Status 404 returned error can't find the container with id 1570cfbf47461df2f4d3fe410c1c775ba44b47e5aff3b1eadb939345aba8bfde Mar 13 10:43:11.764271 master-0 kubenswrapper[17876]: E0313 10:43:11.764083 17876 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c609f93369600 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:899242a15b2bdf3b4a04fb323647ca94,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:43:11.763027456 +0000 UTC m=+99.598833922,LastTimestamp:2026-03-13 10:43:11.763027456 +0000 UTC m=+99.598833922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:43:12.393010 master-0 kubenswrapper[17876]: I0313 10:43:12.392962 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_4c3280e9367536f782caf8bdc07edb85/kube-apiserver-cert-syncer/0.log" Mar 13 10:43:12.394792 master-0 kubenswrapper[17876]: I0313 10:43:12.394688 17876 generic.go:334] "Generic (PLEG): container finished" podID="4c3280e9367536f782caf8bdc07edb85" containerID="5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc" exitCode=0 Mar 13 10:43:12.394792 master-0 kubenswrapper[17876]: I0313 10:43:12.394790 17876 generic.go:334] "Generic (PLEG): container finished" podID="4c3280e9367536f782caf8bdc07edb85" containerID="b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12" exitCode=0 Mar 13 10:43:12.394927 master-0 kubenswrapper[17876]: I0313 10:43:12.394805 17876 generic.go:334] "Generic (PLEG): container finished" podID="4c3280e9367536f782caf8bdc07edb85" containerID="24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2" exitCode=0 Mar 13 10:43:12.394927 master-0 kubenswrapper[17876]: I0313 10:43:12.394815 17876 generic.go:334] "Generic (PLEG): container finished" 
podID="4c3280e9367536f782caf8bdc07edb85" containerID="f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6" exitCode=2 Mar 13 10:43:12.396885 master-0 kubenswrapper[17876]: I0313 10:43:12.396812 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"f39c34196fdd75fee093e6d7425f196a9c2aea8d2fd22351895c1d6588e8828c"} Mar 13 10:43:12.396954 master-0 kubenswrapper[17876]: I0313 10:43:12.396913 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"899242a15b2bdf3b4a04fb323647ca94","Type":"ContainerStarted","Data":"1570cfbf47461df2f4d3fe410c1c775ba44b47e5aff3b1eadb939345aba8bfde"} Mar 13 10:43:12.398908 master-0 kubenswrapper[17876]: I0313 10:43:12.398881 17876 generic.go:334] "Generic (PLEG): container finished" podID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" containerID="e9348f808b26e99ad40729b8262d004e235ea5263723a6967dd0fbdf746974dc" exitCode=0 Mar 13 10:43:12.399047 master-0 kubenswrapper[17876]: I0313 10:43:12.398994 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.399157 master-0 kubenswrapper[17876]: I0313 10:43:12.398922 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab","Type":"ContainerDied","Data":"e9348f808b26e99ad40729b8262d004e235ea5263723a6967dd0fbdf746974dc"} Mar 13 10:43:12.399643 master-0 kubenswrapper[17876]: I0313 10:43:12.399611 17876 status_manager.go:851] "Failed to 
get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.400465 master-0 kubenswrapper[17876]: I0313 10:43:12.400428 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.400988 master-0 kubenswrapper[17876]: I0313 10:43:12.400953 17876 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.401496 master-0 kubenswrapper[17876]: I0313 10:43:12.401452 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.498543 master-0 kubenswrapper[17876]: I0313 10:43:12.498469 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: 
connect: connection refused" Mar 13 10:43:12.499559 master-0 kubenswrapper[17876]: I0313 10:43:12.499444 17876 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.500532 master-0 kubenswrapper[17876]: I0313 10:43:12.500466 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:12.738007 master-0 kubenswrapper[17876]: I0313 10:43:12.737774 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body= Mar 13 10:43:12.738007 master-0 kubenswrapper[17876]: I0313 10:43:12.737925 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" Mar 13 10:43:13.933960 master-0 kubenswrapper[17876]: I0313 10:43:13.933900 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_4c3280e9367536f782caf8bdc07edb85/kube-apiserver-cert-syncer/0.log" Mar 13 10:43:13.934995 master-0 kubenswrapper[17876]: I0313 10:43:13.934940 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:13.936133 master-0 kubenswrapper[17876]: I0313 10:43:13.936068 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:13.936609 master-0 kubenswrapper[17876]: I0313 10:43:13.936565 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:13.937150 master-0 kubenswrapper[17876]: I0313 10:43:13.937116 17876 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.010493 master-0 kubenswrapper[17876]: I0313 10:43:14.010380 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") pod \"4c3280e9367536f782caf8bdc07edb85\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " Mar 13 10:43:14.010493 master-0 kubenswrapper[17876]: I0313 10:43:14.010429 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") pod 
\"4c3280e9367536f782caf8bdc07edb85\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " Mar 13 10:43:14.010493 master-0 kubenswrapper[17876]: I0313 10:43:14.010458 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") pod \"4c3280e9367536f782caf8bdc07edb85\" (UID: \"4c3280e9367536f782caf8bdc07edb85\") " Mar 13 10:43:14.010752 master-0 kubenswrapper[17876]: I0313 10:43:14.010566 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "4c3280e9367536f782caf8bdc07edb85" (UID: "4c3280e9367536f782caf8bdc07edb85"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:43:14.010752 master-0 kubenswrapper[17876]: I0313 10:43:14.010566 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "4c3280e9367536f782caf8bdc07edb85" (UID: "4c3280e9367536f782caf8bdc07edb85"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:43:14.010752 master-0 kubenswrapper[17876]: I0313 10:43:14.010689 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:14.010752 master-0 kubenswrapper[17876]: I0313 10:43:14.010702 17876 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:14.010752 master-0 kubenswrapper[17876]: I0313 10:43:14.010709 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "4c3280e9367536f782caf8bdc07edb85" (UID: "4c3280e9367536f782caf8bdc07edb85"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:43:14.046364 master-0 kubenswrapper[17876]: I0313 10:43:14.046303 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:43:14.047349 master-0 kubenswrapper[17876]: I0313 10:43:14.047300 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.047910 master-0 kubenswrapper[17876]: I0313 10:43:14.047854 17876 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.048472 master-0 kubenswrapper[17876]: I0313 10:43:14.048396 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.111592 master-0 kubenswrapper[17876]: I0313 10:43:14.111498 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kubelet-dir\") pod \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " Mar 13 10:43:14.111592 master-0 kubenswrapper[17876]: I0313 10:43:14.111544 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-var-lock\") pod 
\"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " Mar 13 10:43:14.111592 master-0 kubenswrapper[17876]: I0313 10:43:14.111604 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kube-api-access\") pod \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\" (UID: \"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab\") " Mar 13 10:43:14.112319 master-0 kubenswrapper[17876]: I0313 10:43:14.111641 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" (UID: "6bb1dcdd-03f9-4a09-868f-c574cd2e13ab"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:43:14.112319 master-0 kubenswrapper[17876]: I0313 10:43:14.111676 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-var-lock" (OuterVolumeSpecName: "var-lock") pod "6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" (UID: "6bb1dcdd-03f9-4a09-868f-c574cd2e13ab"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:43:14.112319 master-0 kubenswrapper[17876]: I0313 10:43:14.111978 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:14.112319 master-0 kubenswrapper[17876]: I0313 10:43:14.111994 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:14.112319 master-0 kubenswrapper[17876]: I0313 10:43:14.112003 17876 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/4c3280e9367536f782caf8bdc07edb85-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:14.116439 master-0 kubenswrapper[17876]: I0313 10:43:14.116391 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" (UID: "6bb1dcdd-03f9-4a09-868f-c574cd2e13ab"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:43:14.213246 master-0 kubenswrapper[17876]: I0313 10:43:14.213185 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6bb1dcdd-03f9-4a09-868f-c574cd2e13ab-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:14.424090 master-0 kubenswrapper[17876]: I0313 10:43:14.423988 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"6bb1dcdd-03f9-4a09-868f-c574cd2e13ab","Type":"ContainerDied","Data":"273287cb0b4b4109c5c3555fcccd825b7fcc83c65d60d66f9a122bd8f8599aa3"} Mar 13 10:43:14.424090 master-0 kubenswrapper[17876]: I0313 10:43:14.424154 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="273287cb0b4b4109c5c3555fcccd825b7fcc83c65d60d66f9a122bd8f8599aa3" Mar 13 10:43:14.424698 master-0 kubenswrapper[17876]: I0313 10:43:14.424334 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 13 10:43:14.430006 master-0 kubenswrapper[17876]: I0313 10:43:14.429931 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_4c3280e9367536f782caf8bdc07edb85/kube-apiserver-cert-syncer/0.log" Mar 13 10:43:14.431156 master-0 kubenswrapper[17876]: I0313 10:43:14.431118 17876 generic.go:334] "Generic (PLEG): container finished" podID="4c3280e9367536f782caf8bdc07edb85" containerID="ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f" exitCode=0 Mar 13 10:43:14.431306 master-0 kubenswrapper[17876]: I0313 10:43:14.431209 17876 scope.go:117] "RemoveContainer" containerID="5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc" Mar 13 10:43:14.431455 master-0 kubenswrapper[17876]: I0313 10:43:14.431385 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:43:14.463501 master-0 kubenswrapper[17876]: I0313 10:43:14.463415 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.464308 master-0 kubenswrapper[17876]: I0313 10:43:14.464249 17876 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.464969 master-0 kubenswrapper[17876]: I0313 10:43:14.464896 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.465627 master-0 kubenswrapper[17876]: I0313 10:43:14.465563 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.466913 master-0 kubenswrapper[17876]: I0313 10:43:14.466873 17876 scope.go:117] "RemoveContainer" containerID="b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12" 
Mar 13 10:43:14.467531 master-0 kubenswrapper[17876]: I0313 10:43:14.466964 17876 status_manager.go:851] "Failed to get status for pod" podUID="4c3280e9367536f782caf8bdc07edb85" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.468480 master-0 kubenswrapper[17876]: I0313 10:43:14.468404 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:14.482609 master-0 kubenswrapper[17876]: I0313 10:43:14.482588 17876 scope.go:117] "RemoveContainer" containerID="24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2" Mar 13 10:43:14.496332 master-0 kubenswrapper[17876]: I0313 10:43:14.496293 17876 scope.go:117] "RemoveContainer" containerID="f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6" Mar 13 10:43:14.507960 master-0 kubenswrapper[17876]: I0313 10:43:14.507898 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c3280e9367536f782caf8bdc07edb85" path="/var/lib/kubelet/pods/4c3280e9367536f782caf8bdc07edb85/volumes" Mar 13 10:43:14.518331 master-0 kubenswrapper[17876]: I0313 10:43:14.518030 17876 scope.go:117] "RemoveContainer" containerID="ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f" Mar 13 10:43:14.548303 master-0 kubenswrapper[17876]: I0313 10:43:14.548248 17876 scope.go:117] "RemoveContainer" containerID="b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d" Mar 13 10:43:14.570695 master-0 kubenswrapper[17876]: I0313 10:43:14.570643 17876 scope.go:117] "RemoveContainer" 
containerID="5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc" Mar 13 10:43:14.571453 master-0 kubenswrapper[17876]: E0313 10:43:14.571395 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc\": container with ID starting with 5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc not found: ID does not exist" containerID="5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc" Mar 13 10:43:14.571521 master-0 kubenswrapper[17876]: I0313 10:43:14.571465 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc"} err="failed to get container status \"5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc\": rpc error: code = NotFound desc = could not find container \"5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc\": container with ID starting with 5699d54730f78e77fbe779bcc231fdf127940a389f8867712053170b32159fcc not found: ID does not exist" Mar 13 10:43:14.571521 master-0 kubenswrapper[17876]: I0313 10:43:14.571498 17876 scope.go:117] "RemoveContainer" containerID="b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12" Mar 13 10:43:14.572025 master-0 kubenswrapper[17876]: E0313 10:43:14.571984 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12\": container with ID starting with b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12 not found: ID does not exist" containerID="b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12" Mar 13 10:43:14.572115 master-0 kubenswrapper[17876]: I0313 10:43:14.572021 17876 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12"} err="failed to get container status \"b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12\": rpc error: code = NotFound desc = could not find container \"b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12\": container with ID starting with b7a70dd5f6cba9145872fa2a82e113555139c189ac09a95330b8aa95e3905b12 not found: ID does not exist" Mar 13 10:43:14.572115 master-0 kubenswrapper[17876]: I0313 10:43:14.572047 17876 scope.go:117] "RemoveContainer" containerID="24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2" Mar 13 10:43:14.572729 master-0 kubenswrapper[17876]: E0313 10:43:14.572626 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2\": container with ID starting with 24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2 not found: ID does not exist" containerID="24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2" Mar 13 10:43:14.572981 master-0 kubenswrapper[17876]: I0313 10:43:14.572712 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2"} err="failed to get container status \"24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2\": rpc error: code = NotFound desc = could not find container \"24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2\": container with ID starting with 24aaeebbd98e7e2a5e3ad8164b4e44123ff419909a7002a5c3e5d58531dd14f2 not found: ID does not exist" Mar 13 10:43:14.572981 master-0 kubenswrapper[17876]: I0313 10:43:14.572762 17876 scope.go:117] "RemoveContainer" containerID="f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6" Mar 13 10:43:14.573391 master-0 kubenswrapper[17876]: E0313 
10:43:14.573344 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6\": container with ID starting with f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6 not found: ID does not exist" containerID="f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6" Mar 13 10:43:14.573391 master-0 kubenswrapper[17876]: I0313 10:43:14.573380 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6"} err="failed to get container status \"f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6\": rpc error: code = NotFound desc = could not find container \"f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6\": container with ID starting with f8f9ccfff4a83d829f88af72a40e6de446b1146a25b37ca9beda22052665b6c6 not found: ID does not exist" Mar 13 10:43:14.573579 master-0 kubenswrapper[17876]: I0313 10:43:14.573401 17876 scope.go:117] "RemoveContainer" containerID="ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f" Mar 13 10:43:14.573975 master-0 kubenswrapper[17876]: E0313 10:43:14.573929 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f\": container with ID starting with ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f not found: ID does not exist" containerID="ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f" Mar 13 10:43:14.574136 master-0 kubenswrapper[17876]: I0313 10:43:14.573976 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f"} err="failed to get container status 
\"ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f\": rpc error: code = NotFound desc = could not find container \"ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f\": container with ID starting with ad06ce952fe708a29b278fe1666b256d20ac9e45a3f8820892a71d6071287e7f not found: ID does not exist" Mar 13 10:43:14.574136 master-0 kubenswrapper[17876]: I0313 10:43:14.574006 17876 scope.go:117] "RemoveContainer" containerID="b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d" Mar 13 10:43:14.574494 master-0 kubenswrapper[17876]: E0313 10:43:14.574447 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d\": container with ID starting with b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d not found: ID does not exist" containerID="b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d" Mar 13 10:43:14.574494 master-0 kubenswrapper[17876]: I0313 10:43:14.574478 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d"} err="failed to get container status \"b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d\": rpc error: code = NotFound desc = could not find container \"b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d\": container with ID starting with b808692f4c8c865499eef41427d0f07e92b8085e5c0dd032ea4a049308644a3d not found: ID does not exist" Mar 13 10:43:18.544238 master-0 kubenswrapper[17876]: I0313 10:43:18.544065 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" containerName="oauth-openshift" containerID="cri-o://ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b" 
gracePeriod=15 Mar 13 10:43:19.239974 master-0 kubenswrapper[17876]: I0313 10:43:19.239936 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:43:19.241261 master-0 kubenswrapper[17876]: I0313 10:43:19.241180 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.242056 master-0 kubenswrapper[17876]: I0313 10:43:19.242013 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.242861 master-0 kubenswrapper[17876]: I0313 10:43:19.242817 17876 status_manager.go:851] "Failed to get status for pod" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-65f5b9dbcc-62t45\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.414334 master-0 kubenswrapper[17876]: I0313 10:43:19.414261 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f8fwn\" (UniqueName: \"kubernetes.io/projected/8cfba211-2658-42fe-ac6b-6b6cba002b99-kube-api-access-f8fwn\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414334 master-0 kubenswrapper[17876]: I0313 10:43:19.414334 
17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-ocp-branding-template\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 kubenswrapper[17876]: I0313 10:43:19.414381 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-service-ca\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 kubenswrapper[17876]: I0313 10:43:19.414430 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-provider-selection\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 kubenswrapper[17876]: I0313 10:43:19.414474 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-router-certs\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 kubenswrapper[17876]: I0313 10:43:19.414511 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-login\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 
kubenswrapper[17876]: I0313 10:43:19.414556 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-policies\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 kubenswrapper[17876]: I0313 10:43:19.414591 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-dir\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414662 master-0 kubenswrapper[17876]: I0313 10:43:19.414631 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-serving-cert\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414956 master-0 kubenswrapper[17876]: I0313 10:43:19.414684 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-cliconfig\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414956 master-0 kubenswrapper[17876]: I0313 10:43:19.414724 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-error\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414956 master-0 kubenswrapper[17876]: I0313 10:43:19.414763 17876 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-trusted-ca-bundle\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.414956 master-0 kubenswrapper[17876]: I0313 10:43:19.414894 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:43:19.415113 master-0 kubenswrapper[17876]: I0313 10:43:19.415034 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-session\") pod \"8cfba211-2658-42fe-ac6b-6b6cba002b99\" (UID: \"8cfba211-2658-42fe-ac6b-6b6cba002b99\") " Mar 13 10:43:19.415417 master-0 kubenswrapper[17876]: I0313 10:43:19.415375 17876 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.415618 master-0 kubenswrapper[17876]: I0313 10:43:19.415538 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:43:19.415739 master-0 kubenswrapper[17876]: I0313 10:43:19.415694 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:43:19.416452 master-0 kubenswrapper[17876]: I0313 10:43:19.416293 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:43:19.416904 master-0 kubenswrapper[17876]: I0313 10:43:19.416538 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:43:19.418706 master-0 kubenswrapper[17876]: I0313 10:43:19.418635 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.418852 master-0 kubenswrapper[17876]: I0313 10:43:19.418797 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.419018 master-0 kubenswrapper[17876]: I0313 10:43:19.418909 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.419255 master-0 kubenswrapper[17876]: I0313 10:43:19.419099 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.419539 master-0 kubenswrapper[17876]: I0313 10:43:19.419476 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cfba211-2658-42fe-ac6b-6b6cba002b99-kube-api-access-f8fwn" (OuterVolumeSpecName: "kube-api-access-f8fwn") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). 
InnerVolumeSpecName "kube-api-access-f8fwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:43:19.420110 master-0 kubenswrapper[17876]: I0313 10:43:19.420045 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.420874 master-0 kubenswrapper[17876]: I0313 10:43:19.420771 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.421423 master-0 kubenswrapper[17876]: I0313 10:43:19.421372 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "8cfba211-2658-42fe-ac6b-6b6cba002b99" (UID: "8cfba211-2658-42fe-ac6b-6b6cba002b99"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:43:19.505624 master-0 kubenswrapper[17876]: I0313 10:43:19.505560 17876 generic.go:334] "Generic (PLEG): container finished" podID="8cfba211-2658-42fe-ac6b-6b6cba002b99" containerID="ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b" exitCode=0 Mar 13 10:43:19.505624 master-0 kubenswrapper[17876]: I0313 10:43:19.505613 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" Mar 13 10:43:19.505624 master-0 kubenswrapper[17876]: I0313 10:43:19.505625 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" event={"ID":"8cfba211-2658-42fe-ac6b-6b6cba002b99","Type":"ContainerDied","Data":"ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b"} Mar 13 10:43:19.506031 master-0 kubenswrapper[17876]: I0313 10:43:19.505663 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" event={"ID":"8cfba211-2658-42fe-ac6b-6b6cba002b99","Type":"ContainerDied","Data":"2c3c6c2635030ddb4fb4810666d13cc2ee4a1ad8653c62eed8f5a0f4b3d61fa9"} Mar 13 10:43:19.506031 master-0 kubenswrapper[17876]: I0313 10:43:19.505695 17876 scope.go:117] "RemoveContainer" containerID="ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b" Mar 13 10:43:19.506617 master-0 kubenswrapper[17876]: I0313 10:43:19.506576 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.508000 master-0 kubenswrapper[17876]: I0313 10:43:19.507224 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.508000 master-0 kubenswrapper[17876]: I0313 10:43:19.507776 17876 status_manager.go:851] "Failed to get status for pod" 
podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-65f5b9dbcc-62t45\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.516070 master-0 kubenswrapper[17876]: I0313 10:43:19.516010 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516070 master-0 kubenswrapper[17876]: I0313 10:43:19.516053 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516070 master-0 kubenswrapper[17876]: I0313 10:43:19.516070 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516070 master-0 kubenswrapper[17876]: I0313 10:43:19.516083 17876 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516148 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516166 17876 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516179 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516193 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516206 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516218 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f8fwn\" (UniqueName: \"kubernetes.io/projected/8cfba211-2658-42fe-ac6b-6b6cba002b99-kube-api-access-f8fwn\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516232 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.516505 master-0 kubenswrapper[17876]: I0313 10:43:19.516244 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/8cfba211-2658-42fe-ac6b-6b6cba002b99-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:43:19.529918 master-0 kubenswrapper[17876]: I0313 10:43:19.529736 17876 scope.go:117] "RemoveContainer" containerID="ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b" Mar 13 10:43:19.530646 master-0 kubenswrapper[17876]: E0313 10:43:19.530602 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b\": container with ID starting with ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b not found: ID does not exist" containerID="ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b" Mar 13 10:43:19.530791 master-0 kubenswrapper[17876]: I0313 10:43:19.530644 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b"} err="failed to get container status \"ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b\": rpc error: code = NotFound desc = could not find container \"ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b\": container with ID starting with ecac698da1a3d3bed06986d6ad9307759f2acd1df674990ef57a57afd66c020b not found: ID does not exist" Mar 13 10:43:19.533829 master-0 kubenswrapper[17876]: I0313 10:43:19.533747 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.534500 master-0 kubenswrapper[17876]: I0313 10:43:19.534456 17876 status_manager.go:851] "Failed to get status for pod" 
podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.535188 master-0 kubenswrapper[17876]: I0313 10:43:19.535127 17876 status_manager.go:851] "Failed to get status for pod" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-65f5b9dbcc-62t45\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.546409 master-0 kubenswrapper[17876]: E0313 10:43:19.546165 17876 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c609f93369600 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:899242a15b2bdf3b4a04fb323647ca94,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:43:11.763027456 +0000 UTC m=+99.598833922,LastTimestamp:2026-03-13 10:43:11.763027456 +0000 UTC m=+99.598833922,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:43:19.733516 master-0 
kubenswrapper[17876]: E0313 10:43:19.733276 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.734404 master-0 kubenswrapper[17876]: E0313 10:43:19.734329 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.735615 master-0 kubenswrapper[17876]: E0313 10:43:19.735515 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.736655 master-0 kubenswrapper[17876]: E0313 10:43:19.736564 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.737672 master-0 kubenswrapper[17876]: E0313 10:43:19.737603 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:19.737740 master-0 kubenswrapper[17876]: I0313 10:43:19.737707 17876 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 10:43:19.738904 master-0 kubenswrapper[17876]: E0313 10:43:19.738777 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 13 10:43:19.939478 master-0 kubenswrapper[17876]: E0313 10:43:19.939397 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 13 10:43:20.330504 master-0 kubenswrapper[17876]: I0313 10:43:20.330445 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body= Mar 13 10:43:20.330855 master-0 kubenswrapper[17876]: I0313 10:43:20.330539 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" Mar 13 10:43:20.341239 master-0 kubenswrapper[17876]: E0313 10:43:20.341172 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 13 10:43:21.143036 master-0 kubenswrapper[17876]: E0313 10:43:21.142871 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 13 
10:43:22.497810 master-0 kubenswrapper[17876]: I0313 10:43:22.497704 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:22.501032 master-0 kubenswrapper[17876]: I0313 10:43:22.498477 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:22.501032 master-0 kubenswrapper[17876]: I0313 10:43:22.499153 17876 status_manager.go:851] "Failed to get status for pod" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-65f5b9dbcc-62t45\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:22.738269 master-0 kubenswrapper[17876]: I0313 10:43:22.738143 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body= Mar 13 10:43:22.738269 master-0 kubenswrapper[17876]: I0313 10:43:22.738246 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: 
connection refused" Mar 13 10:43:22.744698 master-0 kubenswrapper[17876]: E0313 10:43:22.744604 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 13 10:43:23.152044 master-0 kubenswrapper[17876]: E0313 10:43:23.151856 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:43:23Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:43:23Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:43:23Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:43:23Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:1295a1f0e74ae87f51a733e28b64c6fdb6b9a5b069a6897b3870fe52cc1c3b0b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:505eeaa3f051e9f4ea6a622aca92e5c4eae07078ca185d9fecfe8cc9b6dfc899\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1739173859},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:4855408bd0e4d0711383d0c14dcad53c98255ff9f83f6cbefb57e47eacc1f1f1\
\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:97bdbb5854e4ad7976209a44cff02c8a2b9542f58ad007c06a5c3a5e8266def1\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1284762325},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:898c67bf7fc973e99114f3148976a6c21ae0dbe413051415588fa9b995f5b331\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:a641939d2096609a4cf6eec872a1476b7c671bfd81cffc2edeb6e9f13c9deeba\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1231028434},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8c978bb5c329452b181f61f00452b4c2bfd83d245db56050bc7607972a791a76\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:e6567accc084db971e077b5ca666357e3a326fa27f69fc7135a5bc2e19f998eb\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1221745369},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeByt
es\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470d
ba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"nam
es\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e
832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\\"],\\\"sizeBytes\\\":487151732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:23.152672 master-0 kubenswrapper[17876]: E0313 10:43:23.152638 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:23.153221 master-0 kubenswrapper[17876]: E0313 10:43:23.153178 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:43:23.153759 master-0 kubenswrapper[17876]: E0313 10:43:23.153728 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:23.154254 master-0 kubenswrapper[17876]: E0313 10:43:23.154221 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:23.154254 master-0 kubenswrapper[17876]: E0313 10:43:23.154249 17876 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 13 10:43:23.493933 master-0 kubenswrapper[17876]: I0313 10:43:23.493762 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:23.496049 master-0 kubenswrapper[17876]: I0313 10:43:23.495929 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:23.497587 master-0 kubenswrapper[17876]: I0313 10:43:23.497436 17876 status_manager.go:851] "Failed to get status for pod" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-65f5b9dbcc-62t45\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:23.498958 master-0 kubenswrapper[17876]: I0313 10:43:23.498781 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:23.533970 master-0 kubenswrapper[17876]: I0313 10:43:23.533907 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:23.533970 master-0 kubenswrapper[17876]: I0313 10:43:23.533966 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:23.536127 master-0 kubenswrapper[17876]: E0313 10:43:23.536019 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:23.536882 master-0 kubenswrapper[17876]: I0313 10:43:23.536841 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:23.560770 master-0 kubenswrapper[17876]: W0313 10:43:23.560689 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod077dd10388b9e3e48a07382126e86621.slice/crio-b413150e6afebb180c0677dd5cb81adcb68e955f18016460771c3054be311e5d WatchSource:0}: Error finding container b413150e6afebb180c0677dd5cb81adcb68e955f18016460771c3054be311e5d: Status 404 returned error can't find the container with id b413150e6afebb180c0677dd5cb81adcb68e955f18016460771c3054be311e5d
Mar 13 10:43:24.541305 master-0 kubenswrapper[17876]: I0313 10:43:24.541235 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29" exitCode=0
Mar 13 10:43:24.541791 master-0 kubenswrapper[17876]: I0313 10:43:24.541314 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerDied","Data":"4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29"}
Mar 13 10:43:24.541791 master-0 kubenswrapper[17876]: I0313 10:43:24.541356 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"b413150e6afebb180c0677dd5cb81adcb68e955f18016460771c3054be311e5d"}
Mar 13 10:43:24.541791 master-0 kubenswrapper[17876]: I0313 10:43:24.541653 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:24.541791 master-0 kubenswrapper[17876]: I0313 10:43:24.541672 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:24.542432 master-0 kubenswrapper[17876]: E0313 10:43:24.542390 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:24.542554 master-0 kubenswrapper[17876]: I0313 10:43:24.542493 17876 status_manager.go:851] "Failed to get status for pod" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" pod="openshift-kube-apiserver/installer-3-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-3-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:24.543690 master-0 kubenswrapper[17876]: I0313 10:43:24.543557 17876 status_manager.go:851] "Failed to get status for pod" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" pod="openshift-authentication/oauth-openshift-65f5b9dbcc-62t45" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-65f5b9dbcc-62t45\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:24.544368 master-0 kubenswrapper[17876]: I0313 10:43:24.544318 17876 status_manager.go:851] "Failed to get status for pod" podUID="899242a15b2bdf3b4a04fb323647ca94" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:43:25.555537 master-0 kubenswrapper[17876]: I0313 10:43:25.555500 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852"}
Mar 13 10:43:25.556067 master-0 kubenswrapper[17876]: I0313 10:43:25.556049 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1"}
Mar 13 10:43:25.563220 master-0 kubenswrapper[17876]: I0313 10:43:25.563152 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2"}
Mar 13 10:43:25.563503 master-0 kubenswrapper[17876]: I0313 10:43:25.563475 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38"}
Mar 13 10:43:26.568251 master-0 kubenswrapper[17876]: I0313 10:43:26.567516 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"8e8f8f57a168775c200793e552fc98f6fa3129d85e0ede2ff0d1df9451ff0848"}
Mar 13 10:43:26.568251 master-0 kubenswrapper[17876]: I0313 10:43:26.567852 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:26.568251 master-0 kubenswrapper[17876]: I0313 10:43:26.567870 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:26.568251 master-0 kubenswrapper[17876]: I0313
10:43:26.568176 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:26.583623 master-0 kubenswrapper[17876]: I0313 10:43:26.583563 17876 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="cbd147b01b260c41122b60c0c59b0fada043d48bb6658bed62fc58e0949c3b69" exitCode=1
Mar 13 10:43:26.583623 master-0 kubenswrapper[17876]: I0313 10:43:26.583623 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"cbd147b01b260c41122b60c0c59b0fada043d48bb6658bed62fc58e0949c3b69"}
Mar 13 10:43:26.584025 master-0 kubenswrapper[17876]: I0313 10:43:26.583665 17876 scope.go:117] "RemoveContainer" containerID="7fa729ef4de02e4f8d7a6b9f78196bb19227b918e6f5b9a633c6ec84c568c7fe"
Mar 13 10:43:26.584616 master-0 kubenswrapper[17876]: I0313 10:43:26.584560 17876 scope.go:117] "RemoveContainer" containerID="cbd147b01b260c41122b60c0c59b0fada043d48bb6658bed62fc58e0949c3b69"
Mar 13 10:43:27.595534 master-0 kubenswrapper[17876]: I0313 10:43:27.595314 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"ff58356cafd17211ab03ac0b3de2df04e88ec6642de92ac89ae8e6565eaf0c07"}
Mar 13 10:43:28.540501 master-0 kubenswrapper[17876]: I0313 10:43:28.540393 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:28.540501 master-0 kubenswrapper[17876]: I0313 10:43:28.540454 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:28.545120 master-0 kubenswrapper[17876]: I0313 10:43:28.545059 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:30.340145 master-0 kubenswrapper[17876]: I0313 10:43:30.339959 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:43:30.342005 master-0 kubenswrapper[17876]: I0313 10:43:30.340163 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:43:31.606498 master-0 kubenswrapper[17876]: I0313 10:43:31.606424 17876 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:32.513641 master-0 kubenswrapper[17876]: I0313 10:43:32.513519 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:43:32.685134 master-0 kubenswrapper[17876]: I0313 10:43:32.685022 17876 scope.go:117] "RemoveContainer" containerID="6bd307155c0397e849a532ef6dcebc4cbbbf850ed4d002b219c4c046ec36c6b8"
Mar 13 10:43:32.719725 master-0 kubenswrapper[17876]: I0313 10:43:32.719563 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-check-endpoints/0.log"
Mar 13 10:43:32.723502 master-0 kubenswrapper[17876]: I0313 10:43:32.723415 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="8e8f8f57a168775c200793e552fc98f6fa3129d85e0ede2ff0d1df9451ff0848" exitCode=255
Mar 13 10:43:32.724466 master-0 kubenswrapper[17876]: I0313 10:43:32.724365 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:32.724526 master-0 kubenswrapper[17876]: I0313 10:43:32.724480 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:32.724876 master-0 kubenswrapper[17876]: I0313 10:43:32.724790 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerDied","Data":"8e8f8f57a168775c200793e552fc98f6fa3129d85e0ede2ff0d1df9451ff0848"}
Mar 13 10:43:32.727808 master-0 kubenswrapper[17876]: I0313 10:43:32.727753 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:43:32.728289 master-0 kubenswrapper[17876]: I0313 10:43:32.728261 17876 scope.go:117] "RemoveContainer" containerID="8e8f8f57a168775c200793e552fc98f6fa3129d85e0ede2ff0d1df9451ff0848"
Mar 13 10:43:32.731180 master-0 kubenswrapper[17876]: I0313 10:43:32.731145 17876 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38"
Mar 13 10:43:32.731264 master-0 kubenswrapper[17876]: I0313 10:43:32.731183 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:32.737400 master-0 kubenswrapper[17876]: I0313 10:43:32.737362 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body=
Mar 13 10:43:32.737487 master-0 kubenswrapper[17876]: I0313 10:43:32.737421 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused"
Mar 13 10:43:33.736021 master-0 kubenswrapper[17876]: I0313 10:43:33.735958 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-check-endpoints/0.log"
Mar 13 10:43:33.738294 master-0 kubenswrapper[17876]: I0313 10:43:33.738253 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"077dd10388b9e3e48a07382126e86621","Type":"ContainerStarted","Data":"7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e"}
Mar 13 10:43:33.738470 master-0 kubenswrapper[17876]: I0313 10:43:33.738445 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:33.738547 master-0 kubenswrapper[17876]: I0313 10:43:33.738526 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:33.738597 master-0 kubenswrapper[17876]: I0313 10:43:33.738549 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:33.742152 master-0 kubenswrapper[17876]: I0313 10:43:33.742119 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:43:34.312385 master-0 kubenswrapper[17876]: I0313 10:43:34.312272 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:43:34.312717 master-0 kubenswrapper[17876]: I0313 10:43:34.312532 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:43:34.318077 master-0 kubenswrapper[17876]: I0313 10:43:34.318030 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:43:34.756649 master-0 kubenswrapper[17876]: I0313 10:43:34.756078 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:34.756649 master-0 kubenswrapper[17876]: I0313 10:43:34.756177 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e7a78b00-f81c-49d4-a9e3-2380ab937aad"
Mar 13 10:43:34.760487 master-0 kubenswrapper[17876]: I0313 10:43:34.760405 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:43:40.330712 master-0 kubenswrapper[17876]: I0313 10:43:40.330653 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:43:40.332314 master-0 kubenswrapper[17876]: I0313 10:43:40.330723 17876
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:43:40.884916 master-0 kubenswrapper[17876]: I0313 10:43:40.884824 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 10:43:41.864242 master-0 kubenswrapper[17876]: I0313 10:43:41.864168 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 13 10:43:42.738320 master-0 kubenswrapper[17876]: I0313 10:43:42.738250 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body=
Mar 13 10:43:42.738668 master-0 kubenswrapper[17876]: I0313 10:43:42.738338 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused"
Mar 13 10:43:42.748877 master-0 kubenswrapper[17876]: I0313 10:43:42.748808 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 10:43:42.784782 master-0 kubenswrapper[17876]: I0313 10:43:42.784710 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 10:43:43.282012 master-0 kubenswrapper[17876]: I0313 10:43:43.281905 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 10:43:43.314974 master-0 kubenswrapper[17876]: I0313 10:43:43.314869 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 13 10:43:43.578732 master-0 kubenswrapper[17876]: I0313 10:43:43.578551 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 10:43:43.651372 master-0 kubenswrapper[17876]: I0313 10:43:43.651250 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 13 10:43:44.037414 master-0 kubenswrapper[17876]: I0313 10:43:44.037330 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 10:43:44.209734 master-0 kubenswrapper[17876]: I0313 10:43:44.209592 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:43:44.318585 master-0 kubenswrapper[17876]: I0313 10:43:44.318396 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 13 10:43:44.499607 master-0 kubenswrapper[17876]: I0313 10:43:44.499496 17876 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 10:43:44.504984 master-0 kubenswrapper[17876]: I0313 10:43:44.504844 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=33.504777682 podStartE2EDuration="33.504777682s" podCreationTimestamp="2026-03-13 10:43:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:43:31.739768534 +0000 UTC m=+119.575575020" watchObservedRunningTime="2026-03-13 10:43:44.504777682 +0000 UTC m=+132.340584188"
Mar 13 10:43:44.511483 master-0 kubenswrapper[17876]: I0313 10:43:44.511418 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-authentication/oauth-openshift-65f5b9dbcc-62t45"]
Mar 13 10:43:44.511646 master-0 kubenswrapper[17876]: I0313 10:43:44.511524 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:43:44.519378 master-0 kubenswrapper[17876]: I0313 10:43:44.519334 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:43:44.537783 master-0 kubenswrapper[17876]: I0313 10:43:44.537663 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=13.537631926 podStartE2EDuration="13.537631926s" podCreationTimestamp="2026-03-13 10:43:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:43:44.534596418 +0000 UTC m=+132.370402894" watchObservedRunningTime="2026-03-13 10:43:44.537631926 +0000 UTC m=+132.373438442"
Mar 13 10:43:44.589410 master-0 kubenswrapper[17876]: I0313 10:43:44.589290 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 10:43:44.606126 master-0 kubenswrapper[17876]: I0313 10:43:44.606077 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 10:43:44.712869 master-0 kubenswrapper[17876]: I0313 10:43:44.712807 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 10:43:44.732165 master-0 kubenswrapper[17876]: I0313 10:43:44.732077 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 10:43:44.814215 master-0 kubenswrapper[17876]: I0313 10:43:44.814148 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert"
Mar 13 10:43:44.962871 master-0 kubenswrapper[17876]: I0313 10:43:44.962819 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 13 10:43:45.028339 master-0 kubenswrapper[17876]: I0313 10:43:45.028263 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 10:43:45.041359 master-0 kubenswrapper[17876]: I0313 10:43:45.041248 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 13 10:43:45.072149 master-0 kubenswrapper[17876]: I0313 10:43:45.072006 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 13 10:43:45.207278 master-0 kubenswrapper[17876]: I0313 10:43:45.207160 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 10:43:45.236811 master-0 kubenswrapper[17876]: I0313 10:43:45.236630 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 10:43:45.237087 master-0 kubenswrapper[17876]: I0313 10:43:45.237064 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:43:45.238422 master-0 kubenswrapper[17876]: I0313 10:43:45.238360 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 10:43:45.243306 master-0 kubenswrapper[17876]: I0313 10:43:45.243255 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-5lpgr"
Mar 13 10:43:45.262281 master-0 kubenswrapper[17876]: I0313 10:43:45.262173 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 10:43:45.283222 master-0 kubenswrapper[17876]: I0313 10:43:45.283177 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 10:43:45.297011 master-0 kubenswrapper[17876]: I0313 10:43:45.296929 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 13 10:43:45.373965 master-0 kubenswrapper[17876]: I0313 10:43:45.373881 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 10:43:45.470789 master-0 kubenswrapper[17876]: I0313 10:43:45.470726 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 13 10:43:45.607182 master-0 kubenswrapper[17876]: I0313 10:43:45.607016 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-jckzr"
Mar 13 10:43:45.622986 master-0 kubenswrapper[17876]: I0313 10:43:45.622944 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 10:43:45.640086 master-0 kubenswrapper[17876]: I0313 10:43:45.640023 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 13 10:43:45.659762 master-0 kubenswrapper[17876]: I0313 10:43:45.659673 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 13 10:43:45.688150
master-0 kubenswrapper[17876]: I0313 10:43:45.688050 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:43:45.718630 master-0 kubenswrapper[17876]: I0313 10:43:45.718567 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 10:43:45.795267 master-0 kubenswrapper[17876]: I0313 10:43:45.795194 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 13 10:43:45.882523 master-0 kubenswrapper[17876]: I0313 10:43:45.882439 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 10:43:46.028563 master-0 kubenswrapper[17876]: I0313 10:43:46.028471 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 13 10:43:46.103654 master-0 kubenswrapper[17876]: I0313 10:43:46.103579 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 10:43:46.167854 master-0 kubenswrapper[17876]: I0313 10:43:46.167677 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 13 10:43:46.233906 master-0 kubenswrapper[17876]: I0313 10:43:46.233819 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 13 10:43:46.283936 master-0 kubenswrapper[17876]: I0313 10:43:46.283844 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 13 10:43:46.307522 master-0 kubenswrapper[17876]: I0313 10:43:46.307448 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 13 10:43:46.385007 master-0 kubenswrapper[17876]: I0313 10:43:46.384915 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:43:46.422532 master-0 kubenswrapper[17876]: I0313 10:43:46.422391 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 13 10:43:46.425254 master-0 kubenswrapper[17876]: I0313 10:43:46.425212 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 10:43:46.505273 master-0 kubenswrapper[17876]: I0313 10:43:46.505209 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 10:43:46.505829 master-0 kubenswrapper[17876]: I0313 10:43:46.505769 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" path="/var/lib/kubelet/pods/8cfba211-2658-42fe-ac6b-6b6cba002b99/volumes"
Mar 13 10:43:46.542734 master-0 kubenswrapper[17876]: I0313 10:43:46.542667 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 10:43:46.559590 master-0 kubenswrapper[17876]: I0313 10:43:46.559542 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 10:43:46.599064 master-0 kubenswrapper[17876]: I0313 10:43:46.599004 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 10:43:46.679967 master-0 kubenswrapper[17876]: I0313 10:43:46.679837 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 10:43:46.818349 master-0 kubenswrapper[17876]: I0313 10:43:46.818300 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 13 10:43:46.822199 master-0 kubenswrapper[17876]: I0313 10:43:46.822121 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 13 10:43:46.836869 master-0 kubenswrapper[17876]: I0313 10:43:46.836807 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 10:43:46.892605 master-0 kubenswrapper[17876]: I0313 10:43:46.892537 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 10:43:46.909930 master-0 kubenswrapper[17876]: I0313 10:43:46.909856 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 10:43:46.922815 master-0 kubenswrapper[17876]: I0313 10:43:46.922755 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 10:43:46.923001 master-0 kubenswrapper[17876]: I0313 10:43:46.922969 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 10:43:47.025360 master-0 kubenswrapper[17876]: I0313 10:43:47.025240 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 10:43:47.026600 master-0 kubenswrapper[17876]: I0313 10:43:47.026448 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 13 10:43:47.133822 master-0 kubenswrapper[17876]: I0313 10:43:47.133647 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 10:43:47.187261 master-0 kubenswrapper[17876]: I0313 10:43:47.187177 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 13 10:43:47.218941 master-0 kubenswrapper[17876]: I0313 10:43:47.218858 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 10:43:47.242147 master-0 kubenswrapper[17876]: I0313 10:43:47.242055 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 13 10:43:47.266449 master-0 kubenswrapper[17876]: I0313 10:43:47.265872 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 10:43:47.396428 master-0 kubenswrapper[17876]: I0313 10:43:47.396369 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 13 10:43:47.433504 master-0 kubenswrapper[17876]: I0313 10:43:47.433465 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-t57pn"
Mar 13 10:43:47.477456 master-0 kubenswrapper[17876]: I0313 10:43:47.477394 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 10:43:47.506740 master-0 kubenswrapper[17876]: I0313 10:43:47.506686 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 13 10:43:47.562166 master-0 kubenswrapper[17876]: I0313 10:43:47.562059 17876 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160
Mar 13 10:43:47.741631 master-0 kubenswrapper[17876]: I0313 10:43:47.739750 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 13 10:43:47.751135 master-0 kubenswrapper[17876]: I0313 10:43:47.751057 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-tvfvf"
Mar 13 10:43:47.753599 master-0 kubenswrapper[17876]: I0313 10:43:47.753563 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 13 10:43:47.907341 master-0 kubenswrapper[17876]: I0313 10:43:47.907274 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 13 10:43:47.937550 master-0 kubenswrapper[17876]: I0313 10:43:47.937496 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 13 10:43:47.983444 master-0 kubenswrapper[17876]: I0313 10:43:47.983390 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 10:43:48.062760 master-0 kubenswrapper[17876]: I0313 10:43:48.062625 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-24kvc"
Mar 13 10:43:48.104910 master-0 kubenswrapper[17876]: I0313 10:43:48.104852 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 13 10:43:48.137849 master-0 kubenswrapper[17876]: I0313 10:43:48.137810 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-c9gmn"
Mar 13 10:43:48.153371 master-0 kubenswrapper[17876]: I0313 10:43:48.153314 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 10:43:48.198493 master-0 kubenswrapper[17876]: I0313 10:43:48.198226 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 10:43:48.241857 master-0 kubenswrapper[17876]: I0313 10:43:48.241810 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 10:43:48.266544 master-0
kubenswrapper[17876]: I0313 10:43:48.266489 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 10:43:48.310860 master-0 kubenswrapper[17876]: I0313 10:43:48.310803 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 13 10:43:48.345738 master-0 kubenswrapper[17876]: I0313 10:43:48.344891 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6"] Mar 13 10:43:48.345738 master-0 kubenswrapper[17876]: E0313 10:43:48.345567 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" containerName="oauth-openshift" Mar 13 10:43:48.345738 master-0 kubenswrapper[17876]: I0313 10:43:48.345605 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" containerName="oauth-openshift" Mar 13 10:43:48.345738 master-0 kubenswrapper[17876]: E0313 10:43:48.345641 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" containerName="installer" Mar 13 10:43:48.345738 master-0 kubenswrapper[17876]: I0313 10:43:48.345651 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" containerName="installer" Mar 13 10:43:48.346267 master-0 kubenswrapper[17876]: I0313 10:43:48.346048 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8cfba211-2658-42fe-ac6b-6b6cba002b99" containerName="oauth-openshift" Mar 13 10:43:48.346267 master-0 kubenswrapper[17876]: I0313 10:43:48.346087 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bb1dcdd-03f9-4a09-868f-c574cd2e13ab" containerName="installer" Mar 13 10:43:48.348543 master-0 kubenswrapper[17876]: I0313 10:43:48.347202 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.350483 master-0 kubenswrapper[17876]: I0313 10:43:48.350371 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 10:43:48.350483 master-0 kubenswrapper[17876]: I0313 10:43:48.350371 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 10:43:48.351140 master-0 kubenswrapper[17876]: I0313 10:43:48.350704 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 10:43:48.351140 master-0 kubenswrapper[17876]: I0313 10:43:48.350853 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 10:43:48.351140 master-0 kubenswrapper[17876]: I0313 10:43:48.350883 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 10:43:48.351140 master-0 kubenswrapper[17876]: I0313 10:43:48.350944 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 10:43:48.351140 master-0 kubenswrapper[17876]: I0313 10:43:48.350914 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 10:43:48.351140 master-0 kubenswrapper[17876]: I0313 10:43:48.351124 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 10:43:48.351601 master-0 kubenswrapper[17876]: I0313 10:43:48.351444 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 10:43:48.351601 master-0 kubenswrapper[17876]: I0313 10:43:48.351522 
17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-kkkpw" Mar 13 10:43:48.353956 master-0 kubenswrapper[17876]: I0313 10:43:48.353800 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 10:43:48.357808 master-0 kubenswrapper[17876]: I0313 10:43:48.357713 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 10:43:48.358843 master-0 kubenswrapper[17876]: I0313 10:43:48.358788 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 10:43:48.370156 master-0 kubenswrapper[17876]: I0313 10:43:48.370046 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 10:43:48.405452 master-0 kubenswrapper[17876]: I0313 10:43:48.405192 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 10:43:48.485163 master-0 kubenswrapper[17876]: I0313 10:43:48.485086 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 10:43:48.491684 master-0 kubenswrapper[17876]: I0313 10:43:48.491645 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-2p9p4" Mar 13 10:43:48.511528 master-0 kubenswrapper[17876]: I0313 10:43:48.511453 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " 
pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.511528 master-0 kubenswrapper[17876]: I0313 10:43:48.511515 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-policies\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.511528 master-0 kubenswrapper[17876]: I0313 10:43:48.511540 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.511999 master-0 kubenswrapper[17876]: I0313 10:43:48.511555 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 10:43:48.511999 master-0 kubenswrapper[17876]: I0313 10:43:48.511560 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-session\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.511999 master-0 kubenswrapper[17876]: I0313 10:43:48.511849 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.511999 master-0 kubenswrapper[17876]: I0313 10:43:48.511940 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcl2w\" (UniqueName: \"kubernetes.io/projected/0673d5a0-3ff3-4d30-995b-829d3f165071-kube-api-access-qcl2w\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512345 master-0 kubenswrapper[17876]: I0313 10:43:48.512005 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-dir\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512345 master-0 kubenswrapper[17876]: I0313 10:43:48.512075 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-service-ca\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512345 master-0 kubenswrapper[17876]: I0313 10:43:48.512175 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-error\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: 
\"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512345 master-0 kubenswrapper[17876]: I0313 10:43:48.512274 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512345 master-0 kubenswrapper[17876]: I0313 10:43:48.512308 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-login\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512622 master-0 kubenswrapper[17876]: I0313 10:43:48.512371 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-router-certs\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.512622 master-0 kubenswrapper[17876]: I0313 10:43:48.512427 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " 
pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.525647 master-0 kubenswrapper[17876]: I0313 10:43:48.525591 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-clxlg" Mar 13 10:43:48.613969 master-0 kubenswrapper[17876]: I0313 10:43:48.613807 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-router-certs\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614191 master-0 kubenswrapper[17876]: I0313 10:43:48.613860 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614191 master-0 kubenswrapper[17876]: I0313 10:43:48.614067 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614335 master-0 kubenswrapper[17876]: I0313 10:43:48.614303 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-policies\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" 
(UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614503 master-0 kubenswrapper[17876]: I0313 10:43:48.614472 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614565 master-0 kubenswrapper[17876]: I0313 10:43:48.614508 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-session\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614746 master-0 kubenswrapper[17876]: I0313 10:43:48.614711 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614827 master-0 kubenswrapper[17876]: I0313 10:43:48.614762 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcl2w\" (UniqueName: \"kubernetes.io/projected/0673d5a0-3ff3-4d30-995b-829d3f165071-kube-api-access-qcl2w\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614827 master-0 kubenswrapper[17876]: I0313 
10:43:48.614798 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-dir\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614946 master-0 kubenswrapper[17876]: I0313 10:43:48.614830 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-service-ca\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614946 master-0 kubenswrapper[17876]: I0313 10:43:48.614855 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-error\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614946 master-0 kubenswrapper[17876]: I0313 10:43:48.614894 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.614946 master-0 kubenswrapper[17876]: I0313 10:43:48.614916 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-login\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.615583 master-0 kubenswrapper[17876]: I0313 10:43:48.615508 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-cliconfig\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.615583 master-0 kubenswrapper[17876]: I0313 10:43:48.615543 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.615710 master-0 kubenswrapper[17876]: I0313 10:43:48.615599 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-dir\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.615710 master-0 kubenswrapper[17876]: I0313 10:43:48.615665 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-policies\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 
10:43:48.616264 master-0 kubenswrapper[17876]: I0313 10:43:48.616222 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-service-ca\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.618833 master-0 kubenswrapper[17876]: I0313 10:43:48.618771 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-error\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.618833 master-0 kubenswrapper[17876]: I0313 10:43:48.618810 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.619014 master-0 kubenswrapper[17876]: I0313 10:43:48.618771 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-session\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.620813 master-0 kubenswrapper[17876]: I0313 10:43:48.620761 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-serving-cert\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.624673 master-0 kubenswrapper[17876]: I0313 10:43:48.624627 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-login\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.625180 master-0 kubenswrapper[17876]: I0313 10:43:48.625140 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-router-certs\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.626440 master-0 kubenswrapper[17876]: I0313 10:43:48.626371 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.640691 master-0 kubenswrapper[17876]: I0313 10:43:48.640634 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-5258b" Mar 13 10:43:48.641738 master-0 kubenswrapper[17876]: I0313 10:43:48.641698 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcl2w\" 
(UniqueName: \"kubernetes.io/projected/0673d5a0-3ff3-4d30-995b-829d3f165071-kube-api-access-qcl2w\") pod \"oauth-openshift-8555c5bbdd-kbpw6\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.664356 master-0 kubenswrapper[17876]: I0313 10:43:48.664312 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 10:43:48.689814 master-0 kubenswrapper[17876]: I0313 10:43:48.689725 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:43:48.896314 master-0 kubenswrapper[17876]: I0313 10:43:48.896250 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 10:43:48.979376 master-0 kubenswrapper[17876]: I0313 10:43:48.979314 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 10:43:48.983839 master-0 kubenswrapper[17876]: I0313 10:43:48.983795 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-fsw7z" Mar 13 10:43:49.013290 master-0 kubenswrapper[17876]: I0313 10:43:49.013236 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 13 10:43:49.034478 master-0 kubenswrapper[17876]: I0313 10:43:49.034227 17876 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 10:43:49.253757 master-0 kubenswrapper[17876]: I0313 10:43:49.253590 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 10:43:49.365518 master-0 kubenswrapper[17876]: I0313 10:43:49.365411 17876 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 10:43:49.518789 master-0 kubenswrapper[17876]: I0313 10:43:49.518426 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 13 10:43:49.518789 master-0 kubenswrapper[17876]: I0313 10:43:49.518757 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 10:43:49.519451 master-0 kubenswrapper[17876]: I0313 10:43:49.518970 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2p4lb" Mar 13 10:43:49.522401 master-0 kubenswrapper[17876]: I0313 10:43:49.520550 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 10:43:49.523694 master-0 kubenswrapper[17876]: I0313 10:43:49.522682 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 10:43:49.532685 master-0 kubenswrapper[17876]: I0313 10:43:49.532632 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 10:43:49.543619 master-0 kubenswrapper[17876]: I0313 10:43:49.543215 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 13 10:43:49.621014 master-0 kubenswrapper[17876]: I0313 10:43:49.620949 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 10:43:49.674439 master-0 kubenswrapper[17876]: I0313 10:43:49.674372 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 13 10:43:49.739336 master-0 kubenswrapper[17876]: I0313 10:43:49.739259 17876 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 13 10:43:49.770088 master-0 kubenswrapper[17876]: I0313 10:43:49.769914 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 10:43:49.805580 master-0 kubenswrapper[17876]: I0313 10:43:49.805513 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 13 10:43:49.816152 master-0 kubenswrapper[17876]: I0313 10:43:49.816064 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 13 10:43:49.827536 master-0 kubenswrapper[17876]: I0313 10:43:49.827474 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 10:43:49.868540 master-0 kubenswrapper[17876]: I0313 10:43:49.868412 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 10:43:49.874804 master-0 kubenswrapper[17876]: I0313 10:43:49.874736 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 13 10:43:49.884342 master-0 kubenswrapper[17876]: I0313 10:43:49.884281 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-qddwq"
Mar 13 10:43:49.918511 master-0 kubenswrapper[17876]: I0313 10:43:49.918447 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 10:43:50.024716 master-0 kubenswrapper[17876]: I0313 10:43:50.024573 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 10:43:50.048969 master-0 kubenswrapper[17876]: I0313 10:43:50.048934 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x4n7x"
Mar 13 10:43:50.123665 master-0 kubenswrapper[17876]: I0313 10:43:50.123612 17876 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 10:43:50.131761 master-0 kubenswrapper[17876]: I0313 10:43:50.131501 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 10:43:50.138313 master-0 kubenswrapper[17876]: I0313 10:43:50.138289 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 10:43:50.232590 master-0 kubenswrapper[17876]: I0313 10:43:50.232368 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 10:43:50.241272 master-0 kubenswrapper[17876]: I0313 10:43:50.241089 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 10:43:50.256519 master-0 kubenswrapper[17876]: I0313 10:43:50.256456 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 13 10:43:50.330645 master-0 kubenswrapper[17876]: I0313 10:43:50.330516 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:43:50.330645 master-0 kubenswrapper[17876]: I0313 10:43:50.330563 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:43:50.339812 master-0 kubenswrapper[17876]: I0313 10:43:50.338319 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 10:43:50.339812 master-0 kubenswrapper[17876]: I0313 10:43:50.339923 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 10:43:50.349460 master-0 kubenswrapper[17876]: I0313 10:43:50.349398 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 10:43:50.372768 master-0 kubenswrapper[17876]: I0313 10:43:50.372698 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 10:43:50.446064 master-0 kubenswrapper[17876]: I0313 10:43:50.445991 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 13 10:43:50.446911 master-0 kubenswrapper[17876]: I0313 10:43:50.446878 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 10:43:50.567640 master-0 kubenswrapper[17876]: I0313 10:43:50.567548 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 13 10:43:50.688486 master-0 kubenswrapper[17876]: I0313 10:43:50.688176 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 13 10:43:50.707609 master-0 kubenswrapper[17876]: I0313 10:43:50.707124 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 13 10:43:50.768249 master-0 kubenswrapper[17876]: I0313 10:43:50.768192 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 10:43:50.818267 master-0 kubenswrapper[17876]: I0313 10:43:50.818204 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 10:43:50.997037 master-0 kubenswrapper[17876]: I0313 10:43:50.996887 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 13 10:43:51.032297 master-0 kubenswrapper[17876]: I0313 10:43:51.032207 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dvqsb"
Mar 13 10:43:51.054537 master-0 kubenswrapper[17876]: I0313 10:43:51.054459 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 13 10:43:51.056845 master-0 kubenswrapper[17876]: I0313 10:43:51.056791 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 13 10:43:51.194961 master-0 kubenswrapper[17876]: I0313 10:43:51.194899 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 13 10:43:51.213796 master-0 kubenswrapper[17876]: I0313 10:43:51.213737 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 13 10:43:51.262679 master-0 kubenswrapper[17876]: I0313 10:43:51.262540 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 10:43:51.323822 master-0 kubenswrapper[17876]: I0313 10:43:51.323719 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 13 10:43:51.411243 master-0 kubenswrapper[17876]: I0313 10:43:51.411158 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 13 10:43:51.700210 master-0 kubenswrapper[17876]: I0313 10:43:51.699440 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 13 10:43:51.700210 master-0 kubenswrapper[17876]: I0313 10:43:51.699823 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 10:43:51.767469 master-0 kubenswrapper[17876]: I0313 10:43:51.767392 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 10:43:51.822328 master-0 kubenswrapper[17876]: I0313 10:43:51.822254 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-ff7d6"
Mar 13 10:43:51.841995 master-0 kubenswrapper[17876]: I0313 10:43:51.841925 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 10:43:52.139379 master-0 kubenswrapper[17876]: I0313 10:43:52.139313 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 10:43:52.157932 master-0 kubenswrapper[17876]: I0313 10:43:52.157835 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 13 10:43:52.158596 master-0 kubenswrapper[17876]: I0313 10:43:52.158528 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 10:43:52.185851 master-0 kubenswrapper[17876]: I0313 10:43:52.185428 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 13 10:43:52.244974 master-0 kubenswrapper[17876]: I0313 10:43:52.244887 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:43:52.254434 master-0 kubenswrapper[17876]: I0313 10:43:52.254369 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:43:52.262058 master-0 kubenswrapper[17876]: I0313 10:43:52.261976 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-5pbvv"
Mar 13 10:43:52.270042 master-0 kubenswrapper[17876]: I0313 10:43:52.269976 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 10:43:52.283304 master-0 kubenswrapper[17876]: I0313 10:43:52.283223 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 10:43:52.301911 master-0 kubenswrapper[17876]: I0313 10:43:52.301832 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 10:43:52.439398 master-0 kubenswrapper[17876]: I0313 10:43:52.439190 17876 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 10:43:52.587893 master-0 kubenswrapper[17876]: I0313 10:43:52.587791 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 10:43:52.665277 master-0 kubenswrapper[17876]: I0313 10:43:52.665224 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 10:43:52.667277 master-0 kubenswrapper[17876]: I0313 10:43:52.667249 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 10:43:52.733817 master-0 kubenswrapper[17876]: I0313 10:43:52.733686 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 13 10:43:52.737250 master-0 kubenswrapper[17876]: I0313 10:43:52.737202 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body=
Mar 13 10:43:52.737353 master-0 kubenswrapper[17876]: I0313 10:43:52.737250 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused"
Mar 13 10:43:52.941942 master-0 kubenswrapper[17876]: I0313 10:43:52.941822 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 10:43:52.946942 master-0 kubenswrapper[17876]: I0313 10:43:52.946879 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 10:43:52.955271 master-0 kubenswrapper[17876]: I0313 10:43:52.955218 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 10:43:53.013319 master-0 kubenswrapper[17876]: I0313 10:43:53.013188 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 10:43:53.055957 master-0 kubenswrapper[17876]: I0313 10:43:53.055902 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 13 10:43:53.093976 master-0 kubenswrapper[17876]: I0313 10:43:53.093912 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 10:43:53.094863 master-0 kubenswrapper[17876]: I0313 10:43:53.094842 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 13 10:43:53.095041 master-0 kubenswrapper[17876]: I0313 10:43:53.094904 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 13 10:43:53.118854 master-0 kubenswrapper[17876]: I0313 10:43:53.118785 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 13 10:43:53.194312 master-0 kubenswrapper[17876]: I0313 10:43:53.194252 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-c2nqj"
Mar 13 10:43:53.243192 master-0 kubenswrapper[17876]: I0313 10:43:53.243088 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 10:43:53.273644 master-0 kubenswrapper[17876]: I0313 10:43:53.273419 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 10:43:53.275537 master-0 kubenswrapper[17876]: I0313 10:43:53.275493 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 10:43:53.599781 master-0 kubenswrapper[17876]: I0313 10:43:53.599644 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 10:43:53.618985 master-0 kubenswrapper[17876]: I0313 10:43:53.618873 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-894vf"
Mar 13 10:43:53.635838 master-0 kubenswrapper[17876]: I0313 10:43:53.635750 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 10:43:53.648307 master-0 kubenswrapper[17876]: I0313 10:43:53.648226 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 10:43:53.887310 master-0 kubenswrapper[17876]: I0313 10:43:53.887243 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 10:43:53.887961 master-0 kubenswrapper[17876]: I0313 10:43:53.887616 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" containerID="cri-o://f39c34196fdd75fee093e6d7425f196a9c2aea8d2fd22351895c1d6588e8828c" gracePeriod=5
Mar 13 10:43:53.897681 master-0 kubenswrapper[17876]: I0313 10:43:53.897625 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 10:43:53.926956 master-0 kubenswrapper[17876]: I0313 10:43:53.926903 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 10:43:53.973155 master-0 kubenswrapper[17876]: I0313 10:43:53.973049 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 10:43:54.043335 master-0 kubenswrapper[17876]: I0313 10:43:54.043259 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 10:43:54.059033 master-0 kubenswrapper[17876]: I0313 10:43:54.058933 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-d2pmx"
Mar 13 10:43:54.071573 master-0 kubenswrapper[17876]: I0313 10:43:54.071498 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 13 10:43:54.115801 master-0 kubenswrapper[17876]: I0313 10:43:54.115713 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 13 10:43:54.193228 master-0 kubenswrapper[17876]: I0313 10:43:54.193058 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 13 10:43:54.303758 master-0 kubenswrapper[17876]: I0313 10:43:54.303693 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 13 10:43:54.308274 master-0 kubenswrapper[17876]: I0313 10:43:54.308238 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 10:43:54.339886 master-0 kubenswrapper[17876]: I0313 10:43:54.339811 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-r9v82"
Mar 13 10:43:54.343152 master-0 kubenswrapper[17876]: I0313 10:43:54.343120 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 10:43:54.353087 master-0 kubenswrapper[17876]: I0313 10:43:54.353027 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 10:43:54.363982 master-0 kubenswrapper[17876]: I0313 10:43:54.363931 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 10:43:54.364448 master-0 kubenswrapper[17876]: I0313 10:43:54.364420 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 10:43:54.385468 master-0 kubenswrapper[17876]: I0313 10:43:54.385402 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 10:43:54.414238 master-0 kubenswrapper[17876]: I0313 10:43:54.414152 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-h7hlp"
Mar 13 10:43:54.449030 master-0 kubenswrapper[17876]: I0313 10:43:54.448842 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 10:43:54.524809 master-0 kubenswrapper[17876]: I0313 10:43:54.524739 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 13 10:43:54.543715 master-0 kubenswrapper[17876]: I0313 10:43:54.543635 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 13 10:43:54.557562 master-0 kubenswrapper[17876]: I0313 10:43:54.557500 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 10:43:54.570676 master-0 kubenswrapper[17876]: I0313 10:43:54.570621 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 13 10:43:54.584385 master-0 kubenswrapper[17876]: I0313 10:43:54.584329 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 10:43:54.593704 master-0 kubenswrapper[17876]: I0313 10:43:54.593382 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 10:43:54.606436 master-0 kubenswrapper[17876]: I0313 10:43:54.606371 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 13 10:43:54.615461 master-0 kubenswrapper[17876]: I0313 10:43:54.615420 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 10:43:54.843550 master-0 kubenswrapper[17876]: I0313 10:43:54.843346 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 13 10:43:54.931458 master-0 kubenswrapper[17876]: I0313 10:43:54.931352 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 10:43:54.938879 master-0 kubenswrapper[17876]: I0313 10:43:54.938826 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 10:43:54.948573 master-0 kubenswrapper[17876]: I0313 10:43:54.948480 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 13 10:43:55.026021 master-0 kubenswrapper[17876]: I0313 10:43:55.025965 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 10:43:55.089419 master-0 kubenswrapper[17876]: I0313 10:43:55.089337 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 13 10:43:55.140590 master-0 kubenswrapper[17876]: I0313 10:43:55.140495 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 13 10:43:55.219787 master-0 kubenswrapper[17876]: I0313 10:43:55.219677 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 13 10:43:55.276030 master-0 kubenswrapper[17876]: I0313 10:43:55.275926 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 10:43:55.325602 master-0 kubenswrapper[17876]: I0313 10:43:55.324024 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 10:43:55.341825 master-0 kubenswrapper[17876]: I0313 10:43:55.341723 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 10:43:55.352532 master-0 kubenswrapper[17876]: I0313 10:43:55.352448 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 10:43:55.354178 master-0 kubenswrapper[17876]: I0313 10:43:55.354111 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 10:43:55.439570 master-0 kubenswrapper[17876]: I0313 10:43:55.439424 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 10:43:55.505561 master-0 kubenswrapper[17876]: I0313 10:43:55.505445 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 13 10:43:55.590754 master-0 kubenswrapper[17876]: I0313 10:43:55.590641 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 13 10:43:55.906557 master-0 kubenswrapper[17876]: I0313 10:43:55.906492 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 10:43:55.923708 master-0 kubenswrapper[17876]: I0313 10:43:55.923640 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 10:43:56.016902 master-0 kubenswrapper[17876]: I0313 10:43:56.016826 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 13 10:43:56.105053 master-0 kubenswrapper[17876]: I0313 10:43:56.105008 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 13 10:43:56.163461 master-0 kubenswrapper[17876]: I0313 10:43:56.163359 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 10:43:56.267172 master-0 kubenswrapper[17876]: I0313 10:43:56.267078 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-jpsrw"
Mar 13 10:43:56.570143 master-0 kubenswrapper[17876]: I0313 10:43:56.569917 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 13 10:43:56.625024 master-0 kubenswrapper[17876]: I0313 10:43:56.624946 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 10:43:56.696634 master-0 kubenswrapper[17876]: I0313 10:43:56.696564 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 10:43:56.731905 master-0 kubenswrapper[17876]: I0313 10:43:56.731813 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 10:43:56.797632 master-0 kubenswrapper[17876]: I0313 10:43:56.797553 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 13 10:43:56.869986 master-0 kubenswrapper[17876]: I0313 10:43:56.869828 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 13 10:43:56.952892 master-0 kubenswrapper[17876]: I0313 10:43:56.952818 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 10:43:57.016383 master-0 kubenswrapper[17876]: I0313 10:43:57.016327 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 10:43:57.049308 master-0 kubenswrapper[17876]: I0313 10:43:57.049250 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 13 10:43:57.063461 master-0 kubenswrapper[17876]: I0313 10:43:57.063406 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 10:43:57.223465 master-0 kubenswrapper[17876]: I0313 10:43:57.223292 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 10:43:57.276762 master-0 kubenswrapper[17876]: I0313 10:43:57.276676 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 10:43:57.338042 master-0 kubenswrapper[17876]: I0313 10:43:57.337982 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 10:43:57.365073 master-0 kubenswrapper[17876]: I0313 10:43:57.365012 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 10:43:57.440848 master-0 kubenswrapper[17876]: I0313 10:43:57.440794 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 10:43:57.606782 master-0 kubenswrapper[17876]: I0313 10:43:57.606594 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 13 10:43:57.680707 master-0 kubenswrapper[17876]: I0313 10:43:57.680646 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:43:57.740724 master-0 kubenswrapper[17876]: I0313 10:43:57.740647 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 10:43:57.832597 master-0 kubenswrapper[17876]: I0313 10:43:57.832506 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-hggr7"
Mar 13 10:43:57.890365 master-0 kubenswrapper[17876]: I0313 10:43:57.890304 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 10:43:57.964714 master-0 kubenswrapper[17876]: I0313 10:43:57.964615 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 13 10:43:58.421629 master-0 kubenswrapper[17876]: I0313 10:43:58.421562 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 10:43:58.694321 master-0 kubenswrapper[17876]: I0313 10:43:58.694139 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 13 10:43:59.054395 master-0 kubenswrapper[17876]: I0313 10:43:59.053533 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log"
Mar 13 10:43:59.054395 master-0 kubenswrapper[17876]: I0313 10:43:59.053664 17876 generic.go:334] "Generic (PLEG): container finished" podID="899242a15b2bdf3b4a04fb323647ca94" containerID="f39c34196fdd75fee093e6d7425f196a9c2aea8d2fd22351895c1d6588e8828c" exitCode=137
Mar 13 10:43:59.334708 master-0 kubenswrapper[17876]: I0313 10:43:59.334590 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 10:43:59.469566 master-0 kubenswrapper[17876]: I0313 10:43:59.469516 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log"
Mar 13 10:43:59.470143 master-0 kubenswrapper[17876]: I0313 10:43:59.469618 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:43:59.534842 master-0 kubenswrapper[17876]: I0313 10:43:59.534783 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 10:43:59.628915 master-0 kubenswrapper[17876]: I0313 10:43:59.628855 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 10:43:59.648685 master-0 kubenswrapper[17876]: I0313 10:43:59.648571 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.648705 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.648781 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.648883 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.648916 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") pod \"899242a15b2bdf3b4a04fb323647ca94\" (UID: \"899242a15b2bdf3b4a04fb323647ca94\") "
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.648958 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock" (OuterVolumeSpecName: "var-lock") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.648949 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log" (OuterVolumeSpecName: "var-log") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.649013 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.649058 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests" (OuterVolumeSpecName: "manifests") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.649307 17876 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-log\") on node \"master-0\" DevicePath \"\""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.649336 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.649353 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:43:59.649418 master-0 kubenswrapper[17876]: I0313 10:43:59.649367 17876 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-manifests\") on node \"master-0\" DevicePath \"\""
Mar 13 10:43:59.659232 master-0 kubenswrapper[17876]: I0313 10:43:59.656426 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "899242a15b2bdf3b4a04fb323647ca94" (UID: "899242a15b2bdf3b4a04fb323647ca94"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:43:59.750441 master-0 kubenswrapper[17876]: I0313 10:43:59.750334 17876 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/899242a15b2bdf3b4a04fb323647ca94-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:44:00.060158 master-0 kubenswrapper[17876]: I0313 10:44:00.059993 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_899242a15b2bdf3b4a04fb323647ca94/startup-monitor/0.log"
Mar 13 10:44:00.060158 master-0 kubenswrapper[17876]: I0313 10:44:00.060071 17876 scope.go:117] "RemoveContainer" containerID="f39c34196fdd75fee093e6d7425f196a9c2aea8d2fd22351895c1d6588e8828c"
Mar 13 10:44:00.060485 master-0 kubenswrapper[17876]: I0313 10:44:00.060195 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:44:00.330555 master-0 kubenswrapper[17876]: I0313 10:44:00.330355 17876 patch_prober.go:28] interesting pod/console-bfb55f4b6-qf9q7 container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused" start-of-body=
Mar 13 10:44:00.330555 master-0 kubenswrapper[17876]: I0313 10:44:00.330413 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" probeResult="failure" output="Get \"https://10.128.0.86:8443/health\": dial tcp 10.128.0.86:8443: connect: connection refused"
Mar 13 10:44:00.500853 master-0 kubenswrapper[17876]: I0313 10:44:00.500743 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="899242a15b2bdf3b4a04fb323647ca94" path="/var/lib/kubelet/pods/899242a15b2bdf3b4a04fb323647ca94/volumes"
Mar 13 10:44:00.502073 master-0 kubenswrapper[17876]: I0313 10:44:00.501029 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 13 10:44:00.515719 master-0 kubenswrapper[17876]: I0313 10:44:00.515627 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:44:00.515719 master-0 kubenswrapper[17876]: I0313 10:44:00.515682 17876 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f74cfb55-5bfd-4b75-b72c-639f4b5a5bd4" Mar 13 10:44:00.520477 master-0 kubenswrapper[17876]: I0313 10:44:00.520387 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:44:00.520477 master-0 kubenswrapper[17876]: I0313 10:44:00.520462 17876 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="f74cfb55-5bfd-4b75-b72c-639f4b5a5bd4" Mar 13 10:44:00.677225 master-0 kubenswrapper[17876]: I0313 10:44:00.677168 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6"] Mar 13 10:44:01.059016 master-0 kubenswrapper[17876]: I0313 10:44:01.058902 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6"] Mar 13 10:44:01.061465 master-0 kubenswrapper[17876]: W0313 10:44:01.061409 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0673d5a0_3ff3_4d30_995b_829d3f165071.slice/crio-7540fa4219d6a8910c174ebbf42745831c7ed3ff23c260516d4014f136e1f42c WatchSource:0}: Error finding container 7540fa4219d6a8910c174ebbf42745831c7ed3ff23c260516d4014f136e1f42c: Status 404 
returned error can't find the container with id 7540fa4219d6a8910c174ebbf42745831c7ed3ff23c260516d4014f136e1f42c Mar 13 10:44:02.123696 master-0 kubenswrapper[17876]: I0313 10:44:02.123587 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" event={"ID":"0673d5a0-3ff3-4d30-995b-829d3f165071","Type":"ContainerStarted","Data":"f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7"} Mar 13 10:44:02.125476 master-0 kubenswrapper[17876]: I0313 10:44:02.125334 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" event={"ID":"0673d5a0-3ff3-4d30-995b-829d3f165071","Type":"ContainerStarted","Data":"7540fa4219d6a8910c174ebbf42745831c7ed3ff23c260516d4014f136e1f42c"} Mar 13 10:44:02.125638 master-0 kubenswrapper[17876]: I0313 10:44:02.125620 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:44:02.136180 master-0 kubenswrapper[17876]: I0313 10:44:02.136120 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:44:02.160034 master-0 kubenswrapper[17876]: I0313 10:44:02.159918 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" podStartSLOduration=69.159895355 podStartE2EDuration="1m9.159895355s" podCreationTimestamp="2026-03-13 10:42:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:44:02.155563369 +0000 UTC m=+149.991369855" watchObservedRunningTime="2026-03-13 10:44:02.159895355 +0000 UTC m=+149.995701821" Mar 13 10:44:02.737964 master-0 kubenswrapper[17876]: I0313 10:44:02.737876 17876 patch_prober.go:28] interesting pod/console-7776f76bf7-f4jhw container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" start-of-body= Mar 13 10:44:02.738259 master-0 kubenswrapper[17876]: I0313 10:44:02.737986 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" probeResult="failure" output="Get \"https://10.128.0.84:8443/health\": dial tcp 10.128.0.84:8443: connect: connection refused" Mar 13 10:44:10.337016 master-0 kubenswrapper[17876]: I0313 10:44:10.336752 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-bfb55f4b6-qf9q7" Mar 13 10:44:10.345453 master-0 kubenswrapper[17876]: I0313 10:44:10.345352 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-bfb55f4b6-qf9q7" Mar 13 10:44:10.435666 master-0 kubenswrapper[17876]: I0313 10:44:10.435570 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7776f76bf7-f4jhw"] Mar 13 10:44:32.824805 master-0 kubenswrapper[17876]: I0313 10:44:32.824626 17876 scope.go:117] "RemoveContainer" containerID="8f137541b8024be9dec3a0e2a3bb479dfd8210f470244154f734979cdb98e7ff" Mar 13 10:44:35.504423 master-0 kubenswrapper[17876]: I0313 10:44:35.504153 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-7776f76bf7-f4jhw" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" containerID="cri-o://5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426" gracePeriod=15 Mar 13 10:44:36.151842 master-0 kubenswrapper[17876]: I0313 10:44:36.151796 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7776f76bf7-f4jhw_9413fefe-20d4-4f4c-939a-c9d45eda6032/console/0.log" Mar 13 10:44:36.152133 master-0 kubenswrapper[17876]: I0313 
10:44:36.151916 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7776f76bf7-f4jhw" Mar 13 10:44:36.269901 master-0 kubenswrapper[17876]: I0313 10:44:36.269804 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-oauth-serving-cert\") pod \"9413fefe-20d4-4f4c-939a-c9d45eda6032\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " Mar 13 10:44:36.269901 master-0 kubenswrapper[17876]: I0313 10:44:36.269884 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-serving-cert\") pod \"9413fefe-20d4-4f4c-939a-c9d45eda6032\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " Mar 13 10:44:36.269901 master-0 kubenswrapper[17876]: I0313 10:44:36.269914 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-service-ca\") pod \"9413fefe-20d4-4f4c-939a-c9d45eda6032\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " Mar 13 10:44:36.270537 master-0 kubenswrapper[17876]: I0313 10:44:36.270076 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-oauth-config\") pod \"9413fefe-20d4-4f4c-939a-c9d45eda6032\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " Mar 13 10:44:36.270537 master-0 kubenswrapper[17876]: I0313 10:44:36.270255 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tb59r\" (UniqueName: \"kubernetes.io/projected/9413fefe-20d4-4f4c-939a-c9d45eda6032-kube-api-access-tb59r\") pod \"9413fefe-20d4-4f4c-939a-c9d45eda6032\" (UID: 
\"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " Mar 13 10:44:36.270537 master-0 kubenswrapper[17876]: I0313 10:44:36.270292 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-config\") pod \"9413fefe-20d4-4f4c-939a-c9d45eda6032\" (UID: \"9413fefe-20d4-4f4c-939a-c9d45eda6032\") " Mar 13 10:44:36.270782 master-0 kubenswrapper[17876]: I0313 10:44:36.270641 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9413fefe-20d4-4f4c-939a-c9d45eda6032" (UID: "9413fefe-20d4-4f4c-939a-c9d45eda6032"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:44:36.270782 master-0 kubenswrapper[17876]: I0313 10:44:36.270668 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-service-ca" (OuterVolumeSpecName: "service-ca") pod "9413fefe-20d4-4f4c-939a-c9d45eda6032" (UID: "9413fefe-20d4-4f4c-939a-c9d45eda6032"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:44:36.270943 master-0 kubenswrapper[17876]: I0313 10:44:36.270810 17876 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:44:36.270943 master-0 kubenswrapper[17876]: I0313 10:44:36.270827 17876 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:44:36.271355 master-0 kubenswrapper[17876]: I0313 10:44:36.271297 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-config" (OuterVolumeSpecName: "console-config") pod "9413fefe-20d4-4f4c-939a-c9d45eda6032" (UID: "9413fefe-20d4-4f4c-939a-c9d45eda6032"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:44:36.274496 master-0 kubenswrapper[17876]: I0313 10:44:36.274443 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9413fefe-20d4-4f4c-939a-c9d45eda6032" (UID: "9413fefe-20d4-4f4c-939a-c9d45eda6032"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:44:36.274636 master-0 kubenswrapper[17876]: I0313 10:44:36.274508 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9413fefe-20d4-4f4c-939a-c9d45eda6032" (UID: "9413fefe-20d4-4f4c-939a-c9d45eda6032"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:44:36.274782 master-0 kubenswrapper[17876]: I0313 10:44:36.274727 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9413fefe-20d4-4f4c-939a-c9d45eda6032-kube-api-access-tb59r" (OuterVolumeSpecName: "kube-api-access-tb59r") pod "9413fefe-20d4-4f4c-939a-c9d45eda6032" (UID: "9413fefe-20d4-4f4c-939a-c9d45eda6032"). InnerVolumeSpecName "kube-api-access-tb59r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:44:36.372039 master-0 kubenswrapper[17876]: I0313 10:44:36.371971 17876 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:44:36.372039 master-0 kubenswrapper[17876]: I0313 10:44:36.372015 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tb59r\" (UniqueName: \"kubernetes.io/projected/9413fefe-20d4-4f4c-939a-c9d45eda6032-kube-api-access-tb59r\") on node \"master-0\" DevicePath \"\"" Mar 13 10:44:36.372039 master-0 kubenswrapper[17876]: I0313 10:44:36.372032 17876 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:44:36.372039 master-0 kubenswrapper[17876]: I0313 10:44:36.372047 17876 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9413fefe-20d4-4f4c-939a-c9d45eda6032-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:44:36.441689 master-0 kubenswrapper[17876]: I0313 10:44:36.441640 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7776f76bf7-f4jhw_9413fefe-20d4-4f4c-939a-c9d45eda6032/console/0.log" Mar 13 10:44:36.441925 master-0 kubenswrapper[17876]: 
I0313 10:44:36.441738 17876 generic.go:334] "Generic (PLEG): container finished" podID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerID="5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426" exitCode=2 Mar 13 10:44:36.441925 master-0 kubenswrapper[17876]: I0313 10:44:36.441788 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7776f76bf7-f4jhw" Mar 13 10:44:36.441925 master-0 kubenswrapper[17876]: I0313 10:44:36.441818 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7776f76bf7-f4jhw" event={"ID":"9413fefe-20d4-4f4c-939a-c9d45eda6032","Type":"ContainerDied","Data":"5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426"} Mar 13 10:44:36.441925 master-0 kubenswrapper[17876]: I0313 10:44:36.441880 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7776f76bf7-f4jhw" event={"ID":"9413fefe-20d4-4f4c-939a-c9d45eda6032","Type":"ContainerDied","Data":"f42a30a155db9e007eb11ac80aa29a3ec9d8d56dd287aa15dcc220c47296ddfa"} Mar 13 10:44:36.441925 master-0 kubenswrapper[17876]: I0313 10:44:36.441909 17876 scope.go:117] "RemoveContainer" containerID="5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426" Mar 13 10:44:36.458734 master-0 kubenswrapper[17876]: I0313 10:44:36.458669 17876 scope.go:117] "RemoveContainer" containerID="5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426" Mar 13 10:44:36.459402 master-0 kubenswrapper[17876]: E0313 10:44:36.459366 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426\": container with ID starting with 5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426 not found: ID does not exist" containerID="5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426" Mar 13 10:44:36.459519 master-0 
kubenswrapper[17876]: I0313 10:44:36.459405 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426"} err="failed to get container status \"5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426\": rpc error: code = NotFound desc = could not find container \"5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426\": container with ID starting with 5efb79fbb8f256de37c3a28bc503466c78150d345b1f6f406cd2c2e543461426 not found: ID does not exist" Mar 13 10:44:36.501566 master-0 kubenswrapper[17876]: I0313 10:44:36.501479 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7776f76bf7-f4jhw"] Mar 13 10:44:36.503216 master-0 kubenswrapper[17876]: I0313 10:44:36.503170 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7776f76bf7-f4jhw"] Mar 13 10:44:38.501357 master-0 kubenswrapper[17876]: I0313 10:44:38.501285 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" path="/var/lib/kubelet/pods/9413fefe-20d4-4f4c-939a-c9d45eda6032/volumes" Mar 13 10:46:08.636069 master-0 kubenswrapper[17876]: I0313 10:46:08.635940 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: E0313 10:46:08.636538 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: I0313 10:46:08.636607 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: E0313 10:46:08.636663 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="899242a15b2bdf3b4a04fb323647ca94" 
containerName="startup-monitor" Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: I0313 10:46:08.636672 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: I0313 10:46:08.636938 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="899242a15b2bdf3b4a04fb323647ca94" containerName="startup-monitor" Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: I0313 10:46:08.636962 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="9413fefe-20d4-4f4c-939a-c9d45eda6032" containerName="console" Mar 13 10:46:08.638260 master-0 kubenswrapper[17876]: I0313 10:46:08.638034 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.642566 master-0 kubenswrapper[17876]: I0313 10:46:08.642424 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 10:46:08.642934 master-0 kubenswrapper[17876]: I0313 10:46:08.642565 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wbwm2" Mar 13 10:46:08.667565 master-0 kubenswrapper[17876]: I0313 10:46:08.666458 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Mar 13 10:46:08.836689 master-0 kubenswrapper[17876]: I0313 10:46:08.836633 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.836689 master-0 kubenswrapper[17876]: I0313 
10:46:08.836695 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.836993 master-0 kubenswrapper[17876]: I0313 10:46:08.836774 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.938073 master-0 kubenswrapper[17876]: I0313 10:46:08.937950 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.938073 master-0 kubenswrapper[17876]: I0313 10:46:08.938029 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.938340 master-0 kubenswrapper[17876]: I0313 10:46:08.938133 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: 
\"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.938404 master-0 kubenswrapper[17876]: I0313 10:46:08.938323 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.938587 master-0 kubenswrapper[17876]: I0313 10:46:08.938546 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.969807 master-0 kubenswrapper[17876]: I0313 10:46:08.969719 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:08.983447 master-0 kubenswrapper[17876]: I0313 10:46:08.983389 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:09.520772 master-0 kubenswrapper[17876]: I0313 10:46:09.520699 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-retry-1-master-0"] Mar 13 10:46:10.243086 master-0 kubenswrapper[17876]: I0313 10:46:10.243003 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"9c685b3c-644d-4253-9fac-6c03fbeed2d5","Type":"ContainerStarted","Data":"a946bd91b2a1464e4bdd327bbf2e60c161f43dc19ba7c51e8dede98c1bc87d04"} Mar 13 10:46:10.243086 master-0 kubenswrapper[17876]: I0313 10:46:10.243079 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"9c685b3c-644d-4253-9fac-6c03fbeed2d5","Type":"ContainerStarted","Data":"0dc9052d86efd503f76116a18369d7fc173cc0e49357b9b1a65840c8ca34da4d"} Mar 13 10:46:10.265314 master-0 kubenswrapper[17876]: I0313 10:46:10.265165 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" podStartSLOduration=2.265127145 podStartE2EDuration="2.265127145s" podCreationTimestamp="2026-03-13 10:46:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:46:10.261393715 +0000 UTC m=+278.097200241" watchObservedRunningTime="2026-03-13 10:46:10.265127145 +0000 UTC m=+278.100933621" Mar 13 10:46:18.960021 master-0 kubenswrapper[17876]: I0313 10:46:18.959923 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 13 10:46:18.961207 master-0 kubenswrapper[17876]: I0313 10:46:18.961169 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:18.963372 master-0 kubenswrapper[17876]: I0313 10:46:18.963278 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hnsk9" Mar 13 10:46:18.964593 master-0 kubenswrapper[17876]: I0313 10:46:18.964539 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 10:46:19.031785 master-0 kubenswrapper[17876]: I0313 10:46:19.031724 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 13 10:46:19.062711 master-0 kubenswrapper[17876]: I0313 10:46:19.062567 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f57bfc81-1c24-4b56-be43-08a173a82b76-kube-api-access\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.062711 master-0 kubenswrapper[17876]: I0313 10:46:19.062707 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-var-lock\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.063028 master-0 kubenswrapper[17876]: I0313 10:46:19.062791 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.163490 master-0 kubenswrapper[17876]: I0313 10:46:19.163425 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.163719 master-0 kubenswrapper[17876]: I0313 10:46:19.163511 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f57bfc81-1c24-4b56-be43-08a173a82b76-kube-api-access\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.163719 master-0 kubenswrapper[17876]: I0313 10:46:19.163556 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-var-lock\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.163719 master-0 kubenswrapper[17876]: I0313 10:46:19.163565 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.163719 master-0 kubenswrapper[17876]: I0313 10:46:19.163670 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-var-lock\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.184329 master-0 kubenswrapper[17876]: I0313 10:46:19.184266 17876 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f57bfc81-1c24-4b56-be43-08a173a82b76-kube-api-access\") pod \"installer-4-master-0\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") " pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.282661 master-0 kubenswrapper[17876]: I0313 10:46:19.282440 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:46:19.700924 master-0 kubenswrapper[17876]: I0313 10:46:19.700864 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"] Mar 13 10:46:20.441906 master-0 kubenswrapper[17876]: I0313 10:46:20.441817 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f57bfc81-1c24-4b56-be43-08a173a82b76","Type":"ContainerStarted","Data":"cb729ad0e1626dd8b0150006e31be5ecd648bdaf7e7a26953eb61e56168cbdf3"} Mar 13 10:46:20.441906 master-0 kubenswrapper[17876]: I0313 10:46:20.441905 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f57bfc81-1c24-4b56-be43-08a173a82b76","Type":"ContainerStarted","Data":"8b7339795fbc7798a7e32d45b2db1baa834187107e63224c4e430fc52fcca69e"} Mar 13 10:46:20.465853 master-0 kubenswrapper[17876]: I0313 10:46:20.465740 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.465709799 podStartE2EDuration="2.465709799s" podCreationTimestamp="2026-03-13 10:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:46:20.465637407 +0000 UTC m=+288.301443913" watchObservedRunningTime="2026-03-13 10:46:20.465709799 +0000 UTC m=+288.301516285" Mar 13 10:46:31.403550 master-0 kubenswrapper[17876]: I0313 
10:46:31.403483 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8"] Mar 13 10:46:31.405031 master-0 kubenswrapper[17876]: I0313 10:46:31.405003 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.407190 master-0 kubenswrapper[17876]: I0313 10:46:31.407146 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 10:46:31.407818 master-0 kubenswrapper[17876]: I0313 10:46:31.407758 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-vggdd" Mar 13 10:46:31.408921 master-0 kubenswrapper[17876]: I0313 10:46:31.408875 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 10:46:31.428344 master-0 kubenswrapper[17876]: I0313 10:46:31.428285 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8"] Mar 13 10:46:31.435293 master-0 kubenswrapper[17876]: I0313 10:46:31.435230 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-f9qr8"] Mar 13 10:46:31.436815 master-0 kubenswrapper[17876]: I0313 10:46:31.436769 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.439968 master-0 kubenswrapper[17876]: I0313 10:46:31.439925 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-9r4nm" Mar 13 10:46:31.442440 master-0 kubenswrapper[17876]: I0313 10:46:31.439935 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 10:46:31.444389 master-0 kubenswrapper[17876]: I0313 10:46:31.444357 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 13 10:46:31.530307 master-0 kubenswrapper[17876]: I0313 10:46:31.530150 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.530307 master-0 kubenswrapper[17876]: I0313 10:46:31.530218 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a759b9-0345-408a-a231-def20aeee523-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.530307 master-0 kubenswrapper[17876]: I0313 10:46:31.530257 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-textfile\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " 
pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.530307 master-0 kubenswrapper[17876]: I0313 10:46:31.530303 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6pqf\" (UniqueName: \"kubernetes.io/projected/f3a759b9-0345-408a-a231-def20aeee523-kube-api-access-v6pqf\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.530307 master-0 kubenswrapper[17876]: I0313 10:46:31.530324 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-root\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.531274 master-0 kubenswrapper[17876]: I0313 10:46:31.530451 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-tls\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.531389 master-0 kubenswrapper[17876]: I0313 10:46:31.531335 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d70466f4-da4f-429d-837a-94d1ede9d7ca-metrics-client-ca\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.531479 master-0 kubenswrapper[17876]: I0313 10:46:31.531452 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-wtmp\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.531548 master-0 kubenswrapper[17876]: I0313 10:46:31.531534 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4msl\" (UniqueName: \"kubernetes.io/projected/d70466f4-da4f-429d-837a-94d1ede9d7ca-kube-api-access-x4msl\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.531605 master-0 kubenswrapper[17876]: I0313 10:46:31.531571 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3a759b9-0345-408a-a231-def20aeee523-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.531605 master-0 kubenswrapper[17876]: I0313 10:46:31.531596 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-sys\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.531878 master-0 kubenswrapper[17876]: I0313 10:46:31.531848 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3a759b9-0345-408a-a231-def20aeee523-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " 
pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.634077 master-0 kubenswrapper[17876]: I0313 10:46:31.633998 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4msl\" (UniqueName: \"kubernetes.io/projected/d70466f4-da4f-429d-837a-94d1ede9d7ca-kube-api-access-x4msl\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634077 master-0 kubenswrapper[17876]: I0313 10:46:31.634073 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3a759b9-0345-408a-a231-def20aeee523-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.634077 master-0 kubenswrapper[17876]: I0313 10:46:31.634114 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-sys\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634143 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3a759b9-0345-408a-a231-def20aeee523-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634170 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634198 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a759b9-0345-408a-a231-def20aeee523-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634227 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-textfile\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634256 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v6pqf\" (UniqueName: \"kubernetes.io/projected/f3a759b9-0345-408a-a231-def20aeee523-kube-api-access-v6pqf\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634275 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-root\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634482 master-0 
kubenswrapper[17876]: I0313 10:46:31.634299 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-tls\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634344 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d70466f4-da4f-429d-837a-94d1ede9d7ca-metrics-client-ca\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634482 master-0 kubenswrapper[17876]: I0313 10:46:31.634363 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-wtmp\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.634812 master-0 kubenswrapper[17876]: I0313 10:46:31.634666 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-wtmp\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.638125 master-0 kubenswrapper[17876]: I0313 10:46:31.635130 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-textfile\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.638125 
master-0 kubenswrapper[17876]: I0313 10:46:31.635360 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-sys\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.638125 master-0 kubenswrapper[17876]: I0313 10:46:31.635699 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/d70466f4-da4f-429d-837a-94d1ede9d7ca-root\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.638125 master-0 kubenswrapper[17876]: I0313 10:46:31.635701 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f3a759b9-0345-408a-a231-def20aeee523-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.638125 master-0 kubenswrapper[17876]: I0313 10:46:31.636488 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/d70466f4-da4f-429d-837a-94d1ede9d7ca-metrics-client-ca\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.639905 master-0 kubenswrapper[17876]: I0313 10:46:31.638703 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.639905 
master-0 kubenswrapper[17876]: I0313 10:46:31.638938 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f3a759b9-0345-408a-a231-def20aeee523-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.641407 master-0 kubenswrapper[17876]: I0313 10:46:31.641376 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f3a759b9-0345-408a-a231-def20aeee523-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.642029 master-0 kubenswrapper[17876]: I0313 10:46:31.642007 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/d70466f4-da4f-429d-837a-94d1ede9d7ca-node-exporter-tls\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.653594 master-0 kubenswrapper[17876]: I0313 10:46:31.653526 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4msl\" (UniqueName: \"kubernetes.io/projected/d70466f4-da4f-429d-837a-94d1ede9d7ca-kube-api-access-x4msl\") pod \"node-exporter-f9qr8\" (UID: \"d70466f4-da4f-429d-837a-94d1ede9d7ca\") " pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:31.654856 master-0 kubenswrapper[17876]: I0313 10:46:31.654811 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6pqf\" (UniqueName: \"kubernetes.io/projected/f3a759b9-0345-408a-a231-def20aeee523-kube-api-access-v6pqf\") pod 
\"openshift-state-metrics-74cc79fd76-sslt8\" (UID: \"f3a759b9-0345-408a-a231-def20aeee523\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.724217 master-0 kubenswrapper[17876]: I0313 10:46:31.724167 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" Mar 13 10:46:31.770388 master-0 kubenswrapper[17876]: I0313 10:46:31.768090 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-f9qr8" Mar 13 10:46:32.023311 master-0 kubenswrapper[17876]: I0313 10:46:32.019803 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 13 10:46:32.023311 master-0 kubenswrapper[17876]: I0313 10:46:32.021746 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.024430 master-0 kubenswrapper[17876]: I0313 10:46:32.024394 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 13 10:46:32.025430 master-0 kubenswrapper[17876]: I0313 10:46:32.025400 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-h29h2" Mar 13 10:46:32.028257 master-0 kubenswrapper[17876]: I0313 10:46:32.028190 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 13 10:46:32.040378 master-0 kubenswrapper[17876]: I0313 10:46:32.040293 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.040378 master-0 kubenswrapper[17876]: I0313 10:46:32.040339 17876 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-var-lock\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.040954 master-0 kubenswrapper[17876]: I0313 10:46:32.040462 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.141921 master-0 kubenswrapper[17876]: I0313 10:46:32.141798 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.142255 master-0 kubenswrapper[17876]: I0313 10:46:32.141931 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.142255 master-0 kubenswrapper[17876]: I0313 10:46:32.141997 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-var-lock\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.142255 master-0 kubenswrapper[17876]: I0313 10:46:32.142034 17876 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-var-lock\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.142255 master-0 kubenswrapper[17876]: I0313 10:46:32.142071 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.153562 master-0 kubenswrapper[17876]: I0313 10:46:32.153484 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8"] Mar 13 10:46:32.157917 master-0 kubenswrapper[17876]: W0313 10:46:32.157825 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3a759b9_0345_408a_a231_def20aeee523.slice/crio-17922b71550f0e50549be4910fb11221749e5faf2f9951aabf2a88759504ae63 WatchSource:0}: Error finding container 17922b71550f0e50549be4910fb11221749e5faf2f9951aabf2a88759504ae63: Status 404 returned error can't find the container with id 17922b71550f0e50549be4910fb11221749e5faf2f9951aabf2a88759504ae63 Mar 13 10:46:32.159027 master-0 kubenswrapper[17876]: I0313 10:46:32.158980 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kube-api-access\") pod \"installer-2-master-0\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.363977 master-0 kubenswrapper[17876]: I0313 10:46:32.363909 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 10:46:32.499903 master-0 kubenswrapper[17876]: I0313 10:46:32.499356 17876 kubelet.go:1505] "Image garbage collection succeeded" Mar 13 10:46:32.550647 master-0 kubenswrapper[17876]: I0313 10:46:32.550577 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" event={"ID":"f3a759b9-0345-408a-a231-def20aeee523","Type":"ContainerStarted","Data":"82b828939fdb61023216432bcb6c4e7023dc4eb982fc4605ad9ccd4ed080df50"} Mar 13 10:46:32.550647 master-0 kubenswrapper[17876]: I0313 10:46:32.550643 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" event={"ID":"f3a759b9-0345-408a-a231-def20aeee523","Type":"ContainerStarted","Data":"cfefc9ff891b02624966bd7858153dfc669e39f93a19d39320414259a1fc46a8"} Mar 13 10:46:32.550822 master-0 kubenswrapper[17876]: I0313 10:46:32.550656 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" event={"ID":"f3a759b9-0345-408a-a231-def20aeee523","Type":"ContainerStarted","Data":"17922b71550f0e50549be4910fb11221749e5faf2f9951aabf2a88759504ae63"} Mar 13 10:46:32.552392 master-0 kubenswrapper[17876]: I0313 10:46:32.552350 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f9qr8" event={"ID":"d70466f4-da4f-429d-837a-94d1ede9d7ca","Type":"ContainerStarted","Data":"d6cfd6a48645a1fceb0eeceac54679c0bec56b5bf055d2795653a8e9eb3d928f"} Mar 13 10:46:32.689229 master-0 kubenswrapper[17876]: I0313 10:46:32.689183 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 10:46:32.694162 master-0 kubenswrapper[17876]: I0313 10:46:32.693757 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.696739 master-0 kubenswrapper[17876]: I0313 10:46:32.696692 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 13 10:46:32.697016 master-0 kubenswrapper[17876]: I0313 10:46:32.696975 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 13 10:46:32.697573 master-0 kubenswrapper[17876]: I0313 10:46:32.697537 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 13 10:46:32.697885 master-0 kubenswrapper[17876]: I0313 10:46:32.697840 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 13 10:46:32.698064 master-0 kubenswrapper[17876]: I0313 10:46:32.698030 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 13 10:46:32.698167 master-0 kubenswrapper[17876]: I0313 10:46:32.698068 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 13 10:46:32.698233 master-0 kubenswrapper[17876]: I0313 10:46:32.698223 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 13 10:46:32.708229 master-0 kubenswrapper[17876]: I0313 10:46:32.708190 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 13 10:46:32.723640 master-0 kubenswrapper[17876]: I0313 10:46:32.722702 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 10:46:32.849150 master-0 kubenswrapper[17876]: I0313 10:46:32.848986 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-etcd/installer-2-master-0"] Mar 13 10:46:32.852499 master-0 kubenswrapper[17876]: I0313 10:46:32.852471 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43106b48-53ec-4da5-9cc3-6e5f10add3ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.852632 master-0 kubenswrapper[17876]: I0313 10:46:32.852507 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.852632 master-0 kubenswrapper[17876]: I0313 10:46:32.852603 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43106b48-53ec-4da5-9cc3-6e5f10add3ec-config-out\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.852828 master-0 kubenswrapper[17876]: I0313 10:46:32.852683 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43106b48-53ec-4da5-9cc3-6e5f10add3ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.852828 master-0 kubenswrapper[17876]: I0313 10:46:32.852737 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: 
\"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.852828 master-0 kubenswrapper[17876]: I0313 10:46:32.852765 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.852828 master-0 kubenswrapper[17876]: I0313 10:46:32.852808 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.853118 master-0 kubenswrapper[17876]: I0313 10:46:32.852842 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43106b48-53ec-4da5-9cc3-6e5f10add3ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.853118 master-0 kubenswrapper[17876]: I0313 10:46:32.852868 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43106b48-53ec-4da5-9cc3-6e5f10add3ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.853118 master-0 
kubenswrapper[17876]: I0313 10:46:32.852957 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-web-config\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.853118 master-0 kubenswrapper[17876]: I0313 10:46:32.852993 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.853118 master-0 kubenswrapper[17876]: I0313 10:46:32.853024 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkknq\" (UniqueName: \"kubernetes.io/projected/43106b48-53ec-4da5-9cc3-6e5f10add3ec-kube-api-access-zkknq\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.954465 master-0 kubenswrapper[17876]: I0313 10:46:32.954340 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.954960 master-0 kubenswrapper[17876]: I0313 10:46:32.954849 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: 
\"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.954960 master-0 kubenswrapper[17876]: I0313 10:46:32.954926 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.955271 master-0 kubenswrapper[17876]: I0313 10:46:32.954971 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43106b48-53ec-4da5-9cc3-6e5f10add3ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.955271 master-0 kubenswrapper[17876]: I0313 10:46:32.954999 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43106b48-53ec-4da5-9cc3-6e5f10add3ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.955417 master-0 kubenswrapper[17876]: I0313 10:46:32.955295 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-web-config\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.955417 master-0 kubenswrapper[17876]: I0313 10:46:32.955336 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.955417 master-0 kubenswrapper[17876]: I0313 10:46:32.955388 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkknq\" (UniqueName: \"kubernetes.io/projected/43106b48-53ec-4da5-9cc3-6e5f10add3ec-kube-api-access-zkknq\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.955431 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43106b48-53ec-4da5-9cc3-6e5f10add3ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.955455 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.955486 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43106b48-53ec-4da5-9cc3-6e5f10add3ec-config-out\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.955592 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43106b48-53ec-4da5-9cc3-6e5f10add3ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.955908 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/43106b48-53ec-4da5-9cc3-6e5f10add3ec-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.958537 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.959330 master-0 kubenswrapper[17876]: I0313 10:46:32.958644 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.960556 master-0 kubenswrapper[17876]: I0313 10:46:32.959756 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43106b48-53ec-4da5-9cc3-6e5f10add3ec-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 
10:46:32.961464 master-0 kubenswrapper[17876]: I0313 10:46:32.961404 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-web-config\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.963148 master-0 kubenswrapper[17876]: I0313 10:46:32.962592 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/43106b48-53ec-4da5-9cc3-6e5f10add3ec-tls-assets\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.963842 master-0 kubenswrapper[17876]: I0313 10:46:32.963364 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-config-volume\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.963842 master-0 kubenswrapper[17876]: I0313 10:46:32.963667 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/43106b48-53ec-4da5-9cc3-6e5f10add3ec-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.966523 master-0 kubenswrapper[17876]: I0313 10:46:32.966472 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.967172 master-0 
kubenswrapper[17876]: I0313 10:46:32.967128 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/43106b48-53ec-4da5-9cc3-6e5f10add3ec-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.969072 master-0 kubenswrapper[17876]: I0313 10:46:32.968991 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/43106b48-53ec-4da5-9cc3-6e5f10add3ec-config-out\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:32.972137 master-0 kubenswrapper[17876]: I0313 10:46:32.972092 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkknq\" (UniqueName: \"kubernetes.io/projected/43106b48-53ec-4da5-9cc3-6e5f10add3ec-kube-api-access-zkknq\") pod \"alertmanager-main-0\" (UID: \"43106b48-53ec-4da5-9cc3-6e5f10add3ec\") " pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:33.030809 master-0 kubenswrapper[17876]: I0313 10:46:33.030742 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 13 10:46:33.664553 master-0 kubenswrapper[17876]: I0313 10:46:33.664491 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960","Type":"ContainerStarted","Data":"24c84f241d3e7c5bd250d1c21d977eb70d1ed2e63b98759512919e0541c9e2e3"} Mar 13 10:46:33.664553 master-0 kubenswrapper[17876]: I0313 10:46:33.664546 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960","Type":"ContainerStarted","Data":"30f03912a2a587ba44133889b8b9353c7c43c3787cc5614c5915702674574e15"} Mar 13 10:46:33.706086 master-0 kubenswrapper[17876]: I0313 10:46:33.704312 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=2.704284413 podStartE2EDuration="2.704284413s" podCreationTimestamp="2026-03-13 10:46:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:46:33.696183252 +0000 UTC m=+301.531989728" watchObservedRunningTime="2026-03-13 10:46:33.704284413 +0000 UTC m=+301.540090889" Mar 13 10:46:33.771123 master-0 kubenswrapper[17876]: I0313 10:46:33.770326 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-66f5cbb768-c7m4c"] Mar 13 10:46:33.783209 master-0 kubenswrapper[17876]: I0313 10:46:33.782557 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.791114 master-0 kubenswrapper[17876]: I0313 10:46:33.789051 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 13 10:46:33.791114 master-0 kubenswrapper[17876]: I0313 10:46:33.789292 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 13 10:46:33.791114 master-0 kubenswrapper[17876]: I0313 10:46:33.789397 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-dr74kngie93ut" Mar 13 10:46:33.791114 master-0 kubenswrapper[17876]: I0313 10:46:33.789534 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 13 10:46:33.791114 master-0 kubenswrapper[17876]: I0313 10:46:33.789630 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 13 10:46:33.798680 master-0 kubenswrapper[17876]: I0313 10:46:33.798612 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 13 10:46:33.814399 master-0 kubenswrapper[17876]: I0313 10:46:33.813613 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66f5cbb768-c7m4c"] Mar 13 10:46:33.921132 master-0 kubenswrapper[17876]: I0313 10:46:33.921060 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-tls\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921283 master-0 
kubenswrapper[17876]: I0313 10:46:33.921154 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921283 master-0 kubenswrapper[17876]: I0313 10:46:33.921180 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5wbh\" (UniqueName: \"kubernetes.io/projected/85999993-ef48-472a-9b16-558097d888f2-kube-api-access-g5wbh\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921283 master-0 kubenswrapper[17876]: I0313 10:46:33.921222 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921283 master-0 kubenswrapper[17876]: I0313 10:46:33.921238 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921483 master-0 kubenswrapper[17876]: I0313 10:46:33.921310 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85999993-ef48-472a-9b16-558097d888f2-metrics-client-ca\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921483 master-0 kubenswrapper[17876]: I0313 10:46:33.921381 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:33.921483 master-0 kubenswrapper[17876]: I0313 10:46:33.921399 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-grpc-tls\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.000764 master-0 kubenswrapper[17876]: I0313 10:46:34.000715 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 13 10:46:34.022601 master-0 kubenswrapper[17876]: I0313 10:46:34.022419 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-tls\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.022601 master-0 kubenswrapper[17876]: I0313 10:46:34.022487 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.022601 master-0 kubenswrapper[17876]: I0313 10:46:34.022509 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5wbh\" (UniqueName: \"kubernetes.io/projected/85999993-ef48-472a-9b16-558097d888f2-kube-api-access-g5wbh\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.022601 master-0 kubenswrapper[17876]: I0313 10:46:34.022551 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.022601 master-0 kubenswrapper[17876]: I0313 10:46:34.022569 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.022601 master-0 kubenswrapper[17876]: I0313 10:46:34.022595 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/85999993-ef48-472a-9b16-558097d888f2-metrics-client-ca\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.023068 master-0 kubenswrapper[17876]: I0313 10:46:34.022630 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-grpc-tls\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.023068 master-0 kubenswrapper[17876]: I0313 10:46:34.022657 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.023946 master-0 kubenswrapper[17876]: I0313 10:46:34.023915 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/85999993-ef48-472a-9b16-558097d888f2-metrics-client-ca\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.026834 master-0 kubenswrapper[17876]: I0313 10:46:34.026788 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " 
pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.026962 master-0 kubenswrapper[17876]: I0313 10:46:34.026871 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.027011 master-0 kubenswrapper[17876]: I0313 10:46:34.026962 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-grpc-tls\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.027011 master-0 kubenswrapper[17876]: I0313 10:46:34.026974 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.027473 master-0 kubenswrapper[17876]: I0313 10:46:34.027425 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-tls\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.028875 master-0 kubenswrapper[17876]: I0313 10:46:34.028471 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/85999993-ef48-472a-9b16-558097d888f2-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.039082 master-0 kubenswrapper[17876]: I0313 10:46:34.039037 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5wbh\" (UniqueName: \"kubernetes.io/projected/85999993-ef48-472a-9b16-558097d888f2-kube-api-access-g5wbh\") pod \"thanos-querier-66f5cbb768-c7m4c\" (UID: \"85999993-ef48-472a-9b16-558097d888f2\") " pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.167688 master-0 kubenswrapper[17876]: I0313 10:46:34.167547 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:34.226217 master-0 kubenswrapper[17876]: W0313 10:46:34.226113 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43106b48_53ec_4da5_9cc3_6e5f10add3ec.slice/crio-e1aa080d0880cbca606e63c868acc1acc6532bfc624aa0d5694a59abd6faa167 WatchSource:0}: Error finding container e1aa080d0880cbca606e63c868acc1acc6532bfc624aa0d5694a59abd6faa167: Status 404 returned error can't find the container with id e1aa080d0880cbca606e63c868acc1acc6532bfc624aa0d5694a59abd6faa167 Mar 13 10:46:34.680445 master-0 kubenswrapper[17876]: I0313 10:46:34.680068 17876 generic.go:334] "Generic (PLEG): container finished" podID="d70466f4-da4f-429d-837a-94d1ede9d7ca" containerID="02c76d093810d491b9763a6bfc39b0b542d6aedbe8a647e4a085c8fb283efe50" exitCode=0 Mar 13 10:46:34.680445 master-0 kubenswrapper[17876]: I0313 10:46:34.680182 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f9qr8" 
event={"ID":"d70466f4-da4f-429d-837a-94d1ede9d7ca","Type":"ContainerDied","Data":"02c76d093810d491b9763a6bfc39b0b542d6aedbe8a647e4a085c8fb283efe50"} Mar 13 10:46:34.685520 master-0 kubenswrapper[17876]: I0313 10:46:34.683509 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" event={"ID":"f3a759b9-0345-408a-a231-def20aeee523","Type":"ContainerStarted","Data":"263525d19e6a5bba839e6e2d400330c736f0355d970348323a2e1819b6a13e66"} Mar 13 10:46:34.685520 master-0 kubenswrapper[17876]: I0313 10:46:34.685132 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"e1aa080d0880cbca606e63c868acc1acc6532bfc624aa0d5694a59abd6faa167"} Mar 13 10:46:34.711881 master-0 kubenswrapper[17876]: I0313 10:46:34.711811 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-66f5cbb768-c7m4c"] Mar 13 10:46:35.693701 master-0 kubenswrapper[17876]: I0313 10:46:35.693645 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"9a18dccabcf8b226a8fc6feafa2b9187f17e57ac8ffa62fec5121339422d8e78"} Mar 13 10:46:35.696817 master-0 kubenswrapper[17876]: I0313 10:46:35.696771 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"d5432fcad1eb8a0beb7cbfd416fe286b6c2aae0cc7137d499776867be45e70da"} Mar 13 10:46:35.701727 master-0 kubenswrapper[17876]: I0313 10:46:35.701696 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f9qr8" 
event={"ID":"d70466f4-da4f-429d-837a-94d1ede9d7ca","Type":"ContainerStarted","Data":"1ded2ee9382444f0c52ab30e8c2566565d8050650a6a6ccc2fef50e58a279b75"} Mar 13 10:46:35.701848 master-0 kubenswrapper[17876]: I0313 10:46:35.701835 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-f9qr8" event={"ID":"d70466f4-da4f-429d-837a-94d1ede9d7ca","Type":"ContainerStarted","Data":"d62ff1ef9441df5bb51d0f78a5daf022ec2080b81a15b5a580137faa32d9044d"} Mar 13 10:46:35.726203 master-0 kubenswrapper[17876]: I0313 10:46:35.726116 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-sslt8" podStartSLOduration=2.898107493 podStartE2EDuration="4.726073101s" podCreationTimestamp="2026-03-13 10:46:31 +0000 UTC" firstStartedPulling="2026-03-13 10:46:32.436470457 +0000 UTC m=+300.272276953" lastFinishedPulling="2026-03-13 10:46:34.264436085 +0000 UTC m=+302.100242561" observedRunningTime="2026-03-13 10:46:34.732909363 +0000 UTC m=+302.568715839" watchObservedRunningTime="2026-03-13 10:46:35.726073101 +0000 UTC m=+303.561879577" Mar 13 10:46:35.752503 master-0 kubenswrapper[17876]: I0313 10:46:35.752421 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-f9qr8" podStartSLOduration=2.874483629 podStartE2EDuration="4.752402213s" podCreationTimestamp="2026-03-13 10:46:31 +0000 UTC" firstStartedPulling="2026-03-13 10:46:31.804148165 +0000 UTC m=+299.639954641" lastFinishedPulling="2026-03-13 10:46:33.682066749 +0000 UTC m=+301.517873225" observedRunningTime="2026-03-13 10:46:35.751976581 +0000 UTC m=+303.587783057" watchObservedRunningTime="2026-03-13 10:46:35.752402213 +0000 UTC m=+303.588208699" Mar 13 10:46:36.714886 master-0 kubenswrapper[17876]: I0313 10:46:36.712825 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-bc7bfc94d-xftlk"] Mar 13 10:46:36.714886 master-0 
kubenswrapper[17876]: I0313 10:46:36.714292 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk" Mar 13 10:46:36.717324 master-0 kubenswrapper[17876]: I0313 10:46:36.717255 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bc7bfc94d-xftlk"] Mar 13 10:46:36.717724 master-0 kubenswrapper[17876]: I0313 10:46:36.717694 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 10:46:36.717964 master-0 kubenswrapper[17876]: I0313 10:46:36.717941 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-6g1360gbh170n" Mar 13 10:46:36.718272 master-0 kubenswrapper[17876]: I0313 10:46:36.718247 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xsbqr" Mar 13 10:46:36.718968 master-0 kubenswrapper[17876]: I0313 10:46:36.718917 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 13 10:46:36.719778 master-0 kubenswrapper[17876]: I0313 10:46:36.719741 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 10:46:36.720852 master-0 kubenswrapper[17876]: I0313 10:46:36.720820 17876 generic.go:334] "Generic (PLEG): container finished" podID="43106b48-53ec-4da5-9cc3-6e5f10add3ec" containerID="d5432fcad1eb8a0beb7cbfd416fe286b6c2aae0cc7137d499776867be45e70da" exitCode=0 Mar 13 10:46:36.720960 master-0 kubenswrapper[17876]: I0313 10:46:36.720929 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerDied","Data":"d5432fcad1eb8a0beb7cbfd416fe286b6c2aae0cc7137d499776867be45e70da"} Mar 13 10:46:36.723408 master-0 kubenswrapper[17876]: 
I0313 10:46:36.723338 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.768300 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0206180-6adb-4536-bc74-06d1d1b18e37-metrics-server-audit-profiles\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.768609 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-secret-metrics-server-tls\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.768747 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0206180-6adb-4536-bc74-06d1d1b18e37-audit-log\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.768796 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0206180-6adb-4536-bc74-06d1d1b18e37-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.768846 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-client-ca-bundle\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.769008 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-secret-metrics-client-certs\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.769307 master-0 kubenswrapper[17876]: I0313 10:46:36.769132 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4mk8\" (UniqueName: \"kubernetes.io/projected/e0206180-6adb-4536-bc74-06d1d1b18e37-kube-api-access-d4mk8\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.869809 master-0 kubenswrapper[17876]: I0313 10:46:36.869730 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-client-ca-bundle\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.869809 master-0 kubenswrapper[17876]: I0313 10:46:36.869815 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-secret-metrics-client-certs\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.870215 master-0 kubenswrapper[17876]: I0313 10:46:36.869876 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4mk8\" (UniqueName: \"kubernetes.io/projected/e0206180-6adb-4536-bc74-06d1d1b18e37-kube-api-access-d4mk8\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.870215 master-0 kubenswrapper[17876]: I0313 10:46:36.869909 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0206180-6adb-4536-bc74-06d1d1b18e37-metrics-server-audit-profiles\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.870215 master-0 kubenswrapper[17876]: I0313 10:46:36.869945 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-secret-metrics-server-tls\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.870215 master-0 kubenswrapper[17876]: I0313 10:46:36.869989 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0206180-6adb-4536-bc74-06d1d1b18e37-audit-log\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.870215 master-0 kubenswrapper[17876]: I0313 10:46:36.870023 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0206180-6adb-4536-bc74-06d1d1b18e37-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.871796 master-0 kubenswrapper[17876]: I0313 10:46:36.871266 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e0206180-6adb-4536-bc74-06d1d1b18e37-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.871796 master-0 kubenswrapper[17876]: I0313 10:46:36.871738 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/e0206180-6adb-4536-bc74-06d1d1b18e37-audit-log\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.873892 master-0 kubenswrapper[17876]: I0313 10:46:36.873649 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/e0206180-6adb-4536-bc74-06d1d1b18e37-metrics-server-audit-profiles\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.883501 master-0 kubenswrapper[17876]: I0313 10:46:36.878763 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-secret-metrics-client-certs\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.883501 master-0 kubenswrapper[17876]: I0313 10:46:36.878801 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-secret-metrics-server-tls\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.886537 master-0 kubenswrapper[17876]: I0313 10:46:36.886499 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4mk8\" (UniqueName: \"kubernetes.io/projected/e0206180-6adb-4536-bc74-06d1d1b18e37-kube-api-access-d4mk8\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:36.887705 master-0 kubenswrapper[17876]: I0313 10:46:36.887670 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e0206180-6adb-4536-bc74-06d1d1b18e37-client-ca-bundle\") pod \"metrics-server-bc7bfc94d-xftlk\" (UID: \"e0206180-6adb-4536-bc74-06d1d1b18e37\") " pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:37.047634 master-0 kubenswrapper[17876]: I0313 10:46:37.047490 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:38.232652 master-0 kubenswrapper[17876]: I0313 10:46:38.232605 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 13 10:46:38.235759 master-0 kubenswrapper[17876]: I0313 10:46:38.235719 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.238397 master-0 kubenswrapper[17876]: I0313 10:46:38.238349 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 13 10:46:38.238709 master-0 kubenswrapper[17876]: I0313 10:46:38.238687 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 13 10:46:38.238833 master-0 kubenswrapper[17876]: I0313 10:46:38.238814 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-fi4otoct6tdgf"
Mar 13 10:46:38.238967 master-0 kubenswrapper[17876]: I0313 10:46:38.238948 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 13 10:46:38.239067 master-0 kubenswrapper[17876]: I0313 10:46:38.239042 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 13 10:46:38.239159 master-0 kubenswrapper[17876]: I0313 10:46:38.239140 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file"
Mar 13 10:46:38.239217 master-0 kubenswrapper[17876]: I0313 10:46:38.239060 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 13 10:46:38.239303 master-0 kubenswrapper[17876]: I0313 10:46:38.239275 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 13 10:46:38.239355 master-0 kubenswrapper[17876]: I0313 10:46:38.239328 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 13 10:46:38.239444 master-0 kubenswrapper[17876]: I0313 10:46:38.239294 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 13 10:46:38.246505 master-0 kubenswrapper[17876]: I0313 10:46:38.245832 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 13 10:46:38.252599 master-0 kubenswrapper[17876]: I0313 10:46:38.252497 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 13 10:46:38.258879 master-0 kubenswrapper[17876]: I0313 10:46:38.258051 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.390741 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.390800 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.390823 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.390857 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.390888 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/11ff187f-f0b5-4873-af20-6acbc41fe1f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391289 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391331 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391369 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391390 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391423 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391459 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/11ff187f-f0b5-4873-af20-6acbc41fe1f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391493 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392003 master-0 kubenswrapper[17876]: I0313 10:46:38.391528 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392883 master-0 kubenswrapper[17876]: I0313 10:46:38.391586 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j49sf\" (UniqueName: \"kubernetes.io/projected/11ff187f-f0b5-4873-af20-6acbc41fe1f2-kube-api-access-j49sf\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392883 master-0 kubenswrapper[17876]: I0313 10:46:38.392632 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392883 master-0 kubenswrapper[17876]: I0313 10:46:38.392664 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-config\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392883 master-0 kubenswrapper[17876]: I0313 10:46:38.392703 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.392883 master-0 kubenswrapper[17876]: I0313 10:46:38.392748 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493753 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493804 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493824 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493846 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493870 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/11ff187f-f0b5-4873-af20-6acbc41fe1f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493894 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493917 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493938 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j49sf\" (UniqueName: \"kubernetes.io/projected/11ff187f-f0b5-4873-af20-6acbc41fe1f2-kube-api-access-j49sf\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493954 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493971 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-config\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.493990 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494017 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494044 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494069 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494118 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494145 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494162 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/11ff187f-f0b5-4873-af20-6acbc41fe1f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494189 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.494965 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.496256 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.498195 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.499160 master-0 kubenswrapper[17876]: I0313 10:46:38.498763 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.500476 master-0 kubenswrapper[17876]: I0313 10:46:38.499233 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.502010 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/11ff187f-f0b5-4873-af20-6acbc41fe1f2-config-out\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.502120 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.502401 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.502520 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.504034 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-config\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.502865 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.504945 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/11ff187f-f0b5-4873-af20-6acbc41fe1f2-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.505560 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.509483 master-0 kubenswrapper[17876]: I0313 10:46:38.509412 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.512332 master-0 kubenswrapper[17876]: I0313 10:46:38.512266 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-web-config\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.515550 master-0 kubenswrapper[17876]: I0313 10:46:38.514892 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/11ff187f-f0b5-4873-af20-6acbc41fe1f2-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.515550 master-0 kubenswrapper[17876]: I0313 10:46:38.515416 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/11ff187f-f0b5-4873-af20-6acbc41fe1f2-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.517457 master-0 kubenswrapper[17876]: I0313 10:46:38.517397 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j49sf\" (UniqueName: \"kubernetes.io/projected/11ff187f-f0b5-4873-af20-6acbc41fe1f2-kube-api-access-j49sf\") pod \"prometheus-k8s-0\" (UID: \"11ff187f-f0b5-4873-af20-6acbc41fe1f2\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.637014 master-0 kubenswrapper[17876]: I0313 10:46:38.636588 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-bc7bfc94d-xftlk"]
Mar 13 10:46:38.707754 master-0 kubenswrapper[17876]: I0313 10:46:38.707707 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:38.764027 master-0 kubenswrapper[17876]: I0313 10:46:38.763969 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"6bf60fdec1a7fe03fde88e1423d7e536fd092a9577f3aac43045198f85cb29ad"}
Mar 13 10:46:38.764027 master-0 kubenswrapper[17876]: I0313 10:46:38.764023 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"9251c6a4a400cce38b1e8986e6603150e5c6ed3e43abb7f286bbc85b2acd45fb"}
Mar 13 10:46:38.768234 master-0 kubenswrapper[17876]: I0313 10:46:38.765303 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk" event={"ID":"e0206180-6adb-4536-bc74-06d1d1b18e37","Type":"ContainerStarted","Data":"42e42106fb647271538f580ff2bc34a21a80d8a92d769e3b76cf53166ebaebbb"}
Mar 13 10:46:38.771525 master-0 kubenswrapper[17876]: I0313 10:46:38.768270 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"0f1e21719cf77fe35b737d5c526fde7b0e5ce05ca75fb5c0cc5a9f46e94e788f"}
Mar 13 10:46:38.771525 master-0 kubenswrapper[17876]: I0313 10:46:38.768312 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"7799ceb7116b41b895dd0095955ba667a9b1a00e9361237fd661cdc53cea80f6"}
Mar 13 10:46:39.234910 master-0 kubenswrapper[17876]: W0313 10:46:39.234823 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11ff187f_f0b5_4873_af20_6acbc41fe1f2.slice/crio-c1d83997bd6fcb6c230c2e9bf0ffb4323a45008567db61b9db44026d768da775 WatchSource:0}: Error finding container c1d83997bd6fcb6c230c2e9bf0ffb4323a45008567db61b9db44026d768da775: Status 404 returned error can't find the container with id c1d83997bd6fcb6c230c2e9bf0ffb4323a45008567db61b9db44026d768da775
Mar 13 10:46:39.238259 master-0 kubenswrapper[17876]: I0313 10:46:39.238202 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 13 10:46:39.779452 master-0 kubenswrapper[17876]: I0313 10:46:39.779353 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"847433842a8857017c370a0db0097d444b81029921abcc93ffdb9e805a928415"}
Mar 13 10:46:39.779452 master-0 kubenswrapper[17876]: I0313 10:46:39.779406 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"6603f1b038b266b148bdc0c580e1255dce5651e33844bac85765b3f8663b3a39"}
Mar 13 10:46:39.779452 master-0 kubenswrapper[17876]: I0313 10:46:39.779423 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"522bafe690e747d9a072515a96b8d4322f9828b0facfd5e47b75ea23c6a9e50f"}
Mar 13 10:46:39.782228 master-0 kubenswrapper[17876]: I0313 10:46:39.782197 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"a01b51d5704cf0e76546f14a6dc0d13303beeadf74b50d6bbeca235f1c4773d8"}
Mar 13 10:46:39.785793 master-0 kubenswrapper[17876]: I0313 10:46:39.784080
17876 generic.go:334] "Generic (PLEG): container finished" podID="11ff187f-f0b5-4873-af20-6acbc41fe1f2" containerID="80842d912df55611cc95b6a1b86e78cef05ab797cebc6e0fed4a3a19e35a60cf" exitCode=0 Mar 13 10:46:39.785793 master-0 kubenswrapper[17876]: I0313 10:46:39.784121 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerDied","Data":"80842d912df55611cc95b6a1b86e78cef05ab797cebc6e0fed4a3a19e35a60cf"} Mar 13 10:46:39.785793 master-0 kubenswrapper[17876]: I0313 10:46:39.784135 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"c1d83997bd6fcb6c230c2e9bf0ffb4323a45008567db61b9db44026d768da775"} Mar 13 10:46:40.803520 master-0 kubenswrapper[17876]: I0313 10:46:40.803463 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"40f2ce989027da3095ab0c8f371b30e4d2bb78bbeb213f54a9d08662df93468c"} Mar 13 10:46:40.804013 master-0 kubenswrapper[17876]: I0313 10:46:40.803574 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"6db07ab2bd2b594a7b32b0e3d04952124450d1f51fd90d5b6afa1ce39b92259d"} Mar 13 10:46:40.806481 master-0 kubenswrapper[17876]: I0313 10:46:40.806431 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk" event={"ID":"e0206180-6adb-4536-bc74-06d1d1b18e37","Type":"ContainerStarted","Data":"ab5f4db0811c50352f51db7328cbefaf841b8eb715a68c33ca4e5e969d6ca929"} Mar 13 10:46:40.816661 master-0 kubenswrapper[17876]: I0313 10:46:40.816592 17876 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"43106b48-53ec-4da5-9cc3-6e5f10add3ec","Type":"ContainerStarted","Data":"5849eb9613d333169871603bda4ff3955df2b6995fcdfefeca394972986959b4"} Mar 13 10:46:40.836189 master-0 kubenswrapper[17876]: I0313 10:46:40.835710 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk" podStartSLOduration=2.976300474 podStartE2EDuration="4.83568986s" podCreationTimestamp="2026-03-13 10:46:36 +0000 UTC" firstStartedPulling="2026-03-13 10:46:38.655032616 +0000 UTC m=+306.490839092" lastFinishedPulling="2026-03-13 10:46:40.514422002 +0000 UTC m=+308.350228478" observedRunningTime="2026-03-13 10:46:40.83008504 +0000 UTC m=+308.665891516" watchObservedRunningTime="2026-03-13 10:46:40.83568986 +0000 UTC m=+308.671496336" Mar 13 10:46:41.830839 master-0 kubenswrapper[17876]: I0313 10:46:41.830704 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" event={"ID":"85999993-ef48-472a-9b16-558097d888f2","Type":"ContainerStarted","Data":"b6cf3ac98337433649e3b4dc6cf4755b766042c9fa0d9eb785935273699e946f"} Mar 13 10:46:41.832145 master-0 kubenswrapper[17876]: I0313 10:46:41.831264 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:41.866551 master-0 kubenswrapper[17876]: I0313 10:46:41.866398 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" podStartSLOduration=3.606304677 podStartE2EDuration="8.866373278s" podCreationTimestamp="2026-03-13 10:46:33 +0000 UTC" firstStartedPulling="2026-03-13 10:46:34.734438466 +0000 UTC m=+302.570244942" lastFinishedPulling="2026-03-13 10:46:39.994507067 +0000 UTC m=+307.830313543" observedRunningTime="2026-03-13 10:46:41.860190042 +0000 UTC m=+309.695996548" 
watchObservedRunningTime="2026-03-13 10:46:41.866373278 +0000 UTC m=+309.702179754" Mar 13 10:46:41.867673 master-0 kubenswrapper[17876]: I0313 10:46:41.867632 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=4.12240776 podStartE2EDuration="9.867624134s" podCreationTimestamp="2026-03-13 10:46:32 +0000 UTC" firstStartedPulling="2026-03-13 10:46:34.251931989 +0000 UTC m=+302.087738475" lastFinishedPulling="2026-03-13 10:46:39.997148373 +0000 UTC m=+307.832954849" observedRunningTime="2026-03-13 10:46:40.870318748 +0000 UTC m=+308.706125244" watchObservedRunningTime="2026-03-13 10:46:41.867624134 +0000 UTC m=+309.703430610" Mar 13 10:46:42.736133 master-0 kubenswrapper[17876]: I0313 10:46:42.735992 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 13 10:46:42.736133 master-0 kubenswrapper[17876]: I0313 10:46:42.736069 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:46:42.736440 master-0 kubenswrapper[17876]: E0313 10:46:42.736333 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.736440 master-0 kubenswrapper[17876]: I0313 10:46:42.736345 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.736440 master-0 kubenswrapper[17876]: E0313 10:46:42.736376 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.736440 master-0 kubenswrapper[17876]: I0313 10:46:42.736382 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 
10:46:42.736440 master-0 kubenswrapper[17876]: E0313 10:46:42.736392 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 10:46:42.736440 master-0 kubenswrapper[17876]: I0313 10:46:42.736399 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 10:46:42.736667 master-0 kubenswrapper[17876]: I0313 10:46:42.736540 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.736667 master-0 kubenswrapper[17876]: I0313 10:46:42.736579 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.736667 master-0 kubenswrapper[17876]: I0313 10:46:42.736591 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 13 10:46:42.736774 master-0 kubenswrapper[17876]: E0313 10:46:42.736743 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.736774 master-0 kubenswrapper[17876]: I0313 10:46:42.736756 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.739689 master-0 kubenswrapper[17876]: I0313 10:46:42.736910 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 13 10:46:42.739689 master-0 kubenswrapper[17876]: I0313 10:46:42.737982 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:42.739689 master-0 kubenswrapper[17876]: I0313 10:46:42.738502 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" containerID="cri-o://d4c4f345608352771d181c87ae83f87748ecbf6ccdee52cebdd330e421648437" gracePeriod=30 Mar 13 10:46:42.739689 master-0 kubenswrapper[17876]: I0313 10:46:42.738553 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://ff58356cafd17211ab03ac0b3de2df04e88ec6642de92ac89ae8e6565eaf0c07" gracePeriod=30 Mar 13 10:46:42.772347 master-0 kubenswrapper[17876]: I0313 10:46:42.772308 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:42.772553 master-0 kubenswrapper[17876]: I0313 10:46:42.772532 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:42.797005 master-0 kubenswrapper[17876]: I0313 10:46:42.796916 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:46:42.874587 
master-0 kubenswrapper[17876]: I0313 10:46:42.874508 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:42.875148 master-0 kubenswrapper[17876]: I0313 10:46:42.874605 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:42.876197 master-0 kubenswrapper[17876]: I0313 10:46:42.876160 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:42.876264 master-0 kubenswrapper[17876]: I0313 10:46:42.876209 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:43.089783 master-0 kubenswrapper[17876]: I0313 10:46:43.089631 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:46:43.584744 master-0 kubenswrapper[17876]: W0313 10:46:43.584690 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9fc87edb050c91d1c07246e5eb5386e.slice/crio-e382da152e0f5f97fec633329a69503040939f0b1f872fdf37b7d896caea3669 WatchSource:0}: Error finding container e382da152e0f5f97fec633329a69503040939f0b1f872fdf37b7d896caea3669: Status 404 returned error can't find the container with id e382da152e0f5f97fec633329a69503040939f0b1f872fdf37b7d896caea3669 Mar 13 10:46:43.638833 master-0 kubenswrapper[17876]: I0313 10:46:43.638782 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:46:43.668182 master-0 kubenswrapper[17876]: I0313 10:46:43.668110 17876 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="3e9fd962-c244-4dfc-ad70-c7602d20ce2e" Mar 13 10:46:43.692494 master-0 kubenswrapper[17876]: I0313 10:46:43.692422 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 10:46:43.692710 master-0 kubenswrapper[17876]: I0313 10:46:43.692525 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 10:46:43.692710 master-0 kubenswrapper[17876]: I0313 10:46:43.692539 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:43.692710 master-0 kubenswrapper[17876]: I0313 10:46:43.692584 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 10:46:43.692710 master-0 kubenswrapper[17876]: I0313 10:46:43.692594 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:43.692710 master-0 kubenswrapper[17876]: I0313 10:46:43.692690 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 10:46:43.692884 master-0 kubenswrapper[17876]: I0313 10:46:43.692720 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 13 10:46:43.692884 master-0 kubenswrapper[17876]: I0313 10:46:43.692688 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:43.692884 master-0 kubenswrapper[17876]: I0313 10:46:43.692705 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:43.692970 master-0 kubenswrapper[17876]: I0313 10:46:43.692914 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:43.693055 master-0 kubenswrapper[17876]: I0313 10:46:43.693032 17876 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:43.693055 master-0 kubenswrapper[17876]: I0313 10:46:43.693052 17876 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:43.693169 master-0 kubenswrapper[17876]: I0313 10:46:43.693063 17876 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:43.693169 master-0 kubenswrapper[17876]: I0313 10:46:43.693073 17876 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:43.693169 master-0 kubenswrapper[17876]: I0313 10:46:43.693084 17876 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:43.845511 master-0 kubenswrapper[17876]: I0313 10:46:43.845174 17876 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="ff58356cafd17211ab03ac0b3de2df04e88ec6642de92ac89ae8e6565eaf0c07" exitCode=0 Mar 13 10:46:43.845511 master-0 kubenswrapper[17876]: I0313 10:46:43.845394 17876 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="d4c4f345608352771d181c87ae83f87748ecbf6ccdee52cebdd330e421648437" exitCode=0 Mar 13 10:46:43.845511 master-0 
kubenswrapper[17876]: I0313 10:46:43.845476 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cbbb79cb751f8c371720ce71b1456aaab49e2fc3a536af6b81b2d1430f111a84" Mar 13 10:46:43.845511 master-0 kubenswrapper[17876]: I0313 10:46:43.845517 17876 scope.go:117] "RemoveContainer" containerID="cbd147b01b260c41122b60c0c59b0fada043d48bb6658bed62fc58e0949c3b69" Mar 13 10:46:43.845887 master-0 kubenswrapper[17876]: I0313 10:46:43.845624 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 13 10:46:43.900056 master-0 kubenswrapper[17876]: I0313 10:46:43.865735 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"e382da152e0f5f97fec633329a69503040939f0b1f872fdf37b7d896caea3669"} Mar 13 10:46:43.900056 master-0 kubenswrapper[17876]: I0313 10:46:43.867166 17876 generic.go:334] "Generic (PLEG): container finished" podID="9c685b3c-644d-4253-9fac-6c03fbeed2d5" containerID="a946bd91b2a1464e4bdd327bbf2e60c161f43dc19ba7c51e8dede98c1bc87d04" exitCode=0 Mar 13 10:46:43.900056 master-0 kubenswrapper[17876]: I0313 10:46:43.867193 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"9c685b3c-644d-4253-9fac-6c03fbeed2d5","Type":"ContainerDied","Data":"a946bd91b2a1464e4bdd327bbf2e60c161f43dc19ba7c51e8dede98c1bc87d04"} Mar 13 10:46:44.186215 master-0 kubenswrapper[17876]: I0313 10:46:44.186160 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-66f5cbb768-c7m4c" Mar 13 10:46:44.505645 master-0 kubenswrapper[17876]: I0313 10:46:44.505580 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" 
path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes" Mar 13 10:46:44.506011 master-0 kubenswrapper[17876]: I0313 10:46:44.505982 17876 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 13 10:46:44.532210 master-0 kubenswrapper[17876]: I0313 10:46:44.531462 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 13 10:46:44.532210 master-0 kubenswrapper[17876]: I0313 10:46:44.531563 17876 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="3e9fd962-c244-4dfc-ad70-c7602d20ce2e" Mar 13 10:46:44.541146 master-0 kubenswrapper[17876]: I0313 10:46:44.540362 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 13 10:46:44.541146 master-0 kubenswrapper[17876]: I0313 10:46:44.540406 17876 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="3e9fd962-c244-4dfc-ad70-c7602d20ce2e" Mar 13 10:46:46.208023 master-0 kubenswrapper[17876]: I0313 10:46:46.207945 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"29434e7ada5f1832672838fe58a905107f7dfd9be5974d0d7de66cd0e1cc6389"} Mar 13 10:46:46.208023 master-0 kubenswrapper[17876]: I0313 10:46:46.208012 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"82a0a8a983553e20a2115fc7beabc618916febd7671ba2ea1e092c5d5bf94644"} Mar 13 10:46:46.208023 master-0 kubenswrapper[17876]: I0313 10:46:46.208027 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"2d5db5db82118d882226b834dd31c5bde7cd059e392d7e43489e8dcb2c6322f0"} Mar 13 10:46:46.208592 master-0 kubenswrapper[17876]: I0313 10:46:46.208060 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"1fb788357876b229d730b39629e7557b07a9adbfdddb7a9575d767b8046c5b41"} Mar 13 10:46:46.208592 master-0 kubenswrapper[17876]: I0313 10:46:46.208080 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"cd876c57ad1e558e2f20c0242e9dc4f689c10401c31b88281fe45eb4d0af05e2"} Mar 13 10:46:46.233599 master-0 kubenswrapper[17876]: I0313 10:46:46.233530 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421"} Mar 13 10:46:46.233949 master-0 kubenswrapper[17876]: I0313 10:46:46.233924 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28"} Mar 13 10:46:46.234518 master-0 kubenswrapper[17876]: I0313 10:46:46.234491 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"d671010960870e0da2e9b058c8ed3d53e5393353d4ea9421bce18bd58bb8d5d1"} Mar 13 10:46:46.572294 master-0 kubenswrapper[17876]: I0313 10:46:46.572233 17876 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" Mar 13 10:46:46.679843 master-0 kubenswrapper[17876]: I0313 10:46:46.679763 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kube-api-access\") pod \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " Mar 13 10:46:46.680083 master-0 kubenswrapper[17876]: I0313 10:46:46.679934 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kubelet-dir\") pod \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " Mar 13 10:46:46.680083 master-0 kubenswrapper[17876]: I0313 10:46:46.679970 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-var-lock\") pod \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\" (UID: \"9c685b3c-644d-4253-9fac-6c03fbeed2d5\") " Mar 13 10:46:46.680436 master-0 kubenswrapper[17876]: I0313 10:46:46.680402 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-var-lock" (OuterVolumeSpecName: "var-lock") pod "9c685b3c-644d-4253-9fac-6c03fbeed2d5" (UID: "9c685b3c-644d-4253-9fac-6c03fbeed2d5"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:46.680586 master-0 kubenswrapper[17876]: I0313 10:46:46.680563 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "9c685b3c-644d-4253-9fac-6c03fbeed2d5" (UID: "9c685b3c-644d-4253-9fac-6c03fbeed2d5"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:46:46.683404 master-0 kubenswrapper[17876]: I0313 10:46:46.683375 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "9c685b3c-644d-4253-9fac-6c03fbeed2d5" (UID: "9c685b3c-644d-4253-9fac-6c03fbeed2d5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:46:46.781476 master-0 kubenswrapper[17876]: I0313 10:46:46.781361 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:46.782009 master-0 kubenswrapper[17876]: I0313 10:46:46.781979 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/9c685b3c-644d-4253-9fac-6c03fbeed2d5-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:46.782106 master-0 kubenswrapper[17876]: I0313 10:46:46.782008 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/9c685b3c-644d-4253-9fac-6c03fbeed2d5-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:46:47.241212 master-0 kubenswrapper[17876]: I0313 10:46:47.241155 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0"
Mar 13 10:46:47.241212 master-0 kubenswrapper[17876]: I0313 10:46:47.241186 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-retry-1-master-0" event={"ID":"9c685b3c-644d-4253-9fac-6c03fbeed2d5","Type":"ContainerDied","Data":"0dc9052d86efd503f76116a18369d7fc173cc0e49357b9b1a65840c8ca34da4d"}
Mar 13 10:46:47.241823 master-0 kubenswrapper[17876]: I0313 10:46:47.241249 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dc9052d86efd503f76116a18369d7fc173cc0e49357b9b1a65840c8ca34da4d"
Mar 13 10:46:47.245389 master-0 kubenswrapper[17876]: I0313 10:46:47.245348 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"11ff187f-f0b5-4873-af20-6acbc41fe1f2","Type":"ContainerStarted","Data":"c98bf456187fced0729889c1474b4604411d0fac143d77d5986998d0b2c479c3"}
Mar 13 10:46:47.248071 master-0 kubenswrapper[17876]: I0313 10:46:47.248038 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190"}
Mar 13 10:46:47.288967 master-0 kubenswrapper[17876]: I0313 10:46:47.288861 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=5.487191536 podStartE2EDuration="9.288834861s" podCreationTimestamp="2026-03-13 10:46:38 +0000 UTC" firstStartedPulling="2026-03-13 10:46:39.789305672 +0000 UTC m=+307.625112148" lastFinishedPulling="2026-03-13 10:46:43.590948997 +0000 UTC m=+311.426755473" observedRunningTime="2026-03-13 10:46:47.283295603 +0000 UTC m=+315.119102109" watchObservedRunningTime="2026-03-13 10:46:47.288834861 +0000 UTC m=+315.124641337"
Mar 13 10:46:47.312448 master-0 kubenswrapper[17876]: I0313 10:46:47.312351 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=5.312326932 podStartE2EDuration="5.312326932s" podCreationTimestamp="2026-03-13 10:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:46:47.306276179 +0000 UTC m=+315.142082665" watchObservedRunningTime="2026-03-13 10:46:47.312326932 +0000 UTC m=+315.148133408"
Mar 13 10:46:48.708876 master-0 kubenswrapper[17876]: I0313 10:46:48.708807 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0"
Mar 13 10:46:53.090513 master-0 kubenswrapper[17876]: I0313 10:46:53.090364 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:53.090513 master-0 kubenswrapper[17876]: I0313 10:46:53.090543 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:53.092211 master-0 kubenswrapper[17876]: I0313 10:46:53.090634 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:53.092211 master-0 kubenswrapper[17876]: I0313 10:46:53.090646 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:53.094257 master-0 kubenswrapper[17876]: I0313 10:46:53.094215 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:53.096413 master-0 kubenswrapper[17876]: I0313 10:46:53.096373 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:54.387379 master-0 kubenswrapper[17876]: I0313 10:46:54.387227 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:54.389048 master-0 kubenswrapper[17876]: I0313 10:46:54.388752 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:46:57.047713 master-0 kubenswrapper[17876]: I0313 10:46:57.047630 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:57.048425 master-0 kubenswrapper[17876]: I0313 10:46:57.047728 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:46:57.823196 master-0 kubenswrapper[17876]: I0313 10:46:57.823104 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:46:57.824157 master-0 kubenswrapper[17876]: I0313 10:46:57.823632 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver" containerID="cri-o://76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38" gracePeriod=15
Mar 13 10:46:57.824157 master-0 kubenswrapper[17876]: I0313 10:46:57.823688 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e" gracePeriod=15
Mar 13 10:46:57.824157 master-0 kubenswrapper[17876]: I0313 10:46:57.823704 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1" gracePeriod=15
Mar 13 10:46:57.824157 master-0 kubenswrapper[17876]: I0313 10:46:57.823833 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer" containerID="cri-o://245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2" gracePeriod=15
Mar 13 10:46:57.824157 master-0 kubenswrapper[17876]: E0313 10:46:57.823915 17876 file.go:109] "Unable to process watch event" err="can't process config file \"/etc/kubernetes/manifests/kube-apiserver-pod.yaml\": /etc/kubernetes/manifests/kube-apiserver-pod.yaml: couldn't parse as pod(Object 'Kind' is missing in 'null'), please check config file"
Mar 13 10:46:57.824157 master-0 kubenswrapper[17876]: I0313 10:46:57.823708 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852" gracePeriod=15
Mar 13 10:46:57.826409 master-0 kubenswrapper[17876]: I0313 10:46:57.826369 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:46:57.826811 master-0 kubenswrapper[17876]: E0313 10:46:57.826776 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 10:46:57.826887 master-0 kubenswrapper[17876]: I0313 10:46:57.826816 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 10:46:57.826887 master-0 kubenswrapper[17876]: E0313 10:46:57.826851 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 10:46:57.826887 master-0 kubenswrapper[17876]: I0313 10:46:57.826863 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 10:46:57.826887 master-0 kubenswrapper[17876]: E0313 10:46:57.826880 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c685b3c-644d-4253-9fac-6c03fbeed2d5" containerName="installer"
Mar 13 10:46:57.826887 master-0 kubenswrapper[17876]: I0313 10:46:57.826890 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c685b3c-644d-4253-9fac-6c03fbeed2d5" containerName="installer"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: E0313 10:46:57.826905 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: I0313 10:46:57.826912 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: E0313 10:46:57.826931 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: I0313 10:46:57.826938 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="setup"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: E0313 10:46:57.826952 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: I0313 10:46:57.826960 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: E0313 10:46:57.826975 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: I0313 10:46:57.826983 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: E0313 10:46:57.826999 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 13 10:46:57.827134 master-0 kubenswrapper[17876]: I0313 10:46:57.827006 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827320 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-syncer"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827356 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827385 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-check-endpoints"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827404 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827416 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827427 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c685b3c-644d-4253-9fac-6c03fbeed2d5" containerName="installer"
Mar 13 10:46:57.827467 master-0 kubenswrapper[17876]: I0313 10:46:57.827441 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="077dd10388b9e3e48a07382126e86621" containerName="kube-apiserver-insecure-readyz"
Mar 13 10:46:57.830854 master-0 kubenswrapper[17876]: I0313 10:46:57.830801 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 10:46:57.831805 master-0 kubenswrapper[17876]: I0313 10:46:57.831637 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:57.840170 master-0 kubenswrapper[17876]: I0313 10:46:57.839735 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="077dd10388b9e3e48a07382126e86621" podUID="48512e02022680c9d90092634f0fc146"
Mar 13 10:46:57.942044 master-0 kubenswrapper[17876]: I0313 10:46:57.941991 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:57.942044 master-0 kubenswrapper[17876]: I0313 10:46:57.942046 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:57.942213 master-0 kubenswrapper[17876]: I0313 10:46:57.942065 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:57.942213 master-0 kubenswrapper[17876]: I0313 10:46:57.942117 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:57.942213 master-0 kubenswrapper[17876]: I0313 10:46:57.942152 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:57.942213 master-0 kubenswrapper[17876]: I0313 10:46:57.942178 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:57.942213 master-0 kubenswrapper[17876]: I0313 10:46:57.942192 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:57.942390 master-0 kubenswrapper[17876]: I0313 10:46:57.942228 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:57.959174 master-0 kubenswrapper[17876]: E0313 10:46:57.958621 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044135 master-0 kubenswrapper[17876]: I0313 10:46:58.044049 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:58.044135 master-0 kubenswrapper[17876]: I0313 10:46:58.044096 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:58.044135 master-0 kubenswrapper[17876]: I0313 10:46:58.044130 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044521 master-0 kubenswrapper[17876]: I0313 10:46:58.044157 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:58.044521 master-0 kubenswrapper[17876]: I0313 10:46:58.044185 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044521 master-0 kubenswrapper[17876]: I0313 10:46:58.044208 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044521 master-0 kubenswrapper[17876]: I0313 10:46:58.044221 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044521 master-0 kubenswrapper[17876]: I0313 10:46:58.044248 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044521 master-0 kubenswrapper[17876]: I0313 10:46:58.044350 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044528 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044554 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044576 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044595 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044615 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044635 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.044828 master-0 kubenswrapper[17876]: I0313 10:46:58.044656 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.260215 master-0 kubenswrapper[17876]: I0313 10:46:58.259803 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:58.295555 master-0 kubenswrapper[17876]: W0313 10:46:58.295474 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a18cac8a90d6913a6a0391d805cddc9.slice/crio-e25d1a564a5f47ecd81a7438125da176e39b0c45fc95630f7ef41fd3d3fc80a3 WatchSource:0}: Error finding container e25d1a564a5f47ecd81a7438125da176e39b0c45fc95630f7ef41fd3d3fc80a3: Status 404 returned error can't find the container with id e25d1a564a5f47ecd81a7438125da176e39b0c45fc95630f7ef41fd3d3fc80a3
Mar 13 10:46:58.299762 master-0 kubenswrapper[17876]: E0313 10:46:58.299606 17876 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c60d451cbcd22 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:3a18cac8a90d6913a6a0391d805cddc9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:46:58.298776866 +0000 UTC m=+326.134583342,LastTimestamp:2026-03-13 10:46:58.298776866 +0000 UTC m=+326.134583342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:46:58.332042 master-0 kubenswrapper[17876]: I0313 10:46:58.331969 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"3a18cac8a90d6913a6a0391d805cddc9","Type":"ContainerStarted","Data":"e25d1a564a5f47ecd81a7438125da176e39b0c45fc95630f7ef41fd3d3fc80a3"}
Mar 13 10:46:58.334900 master-0 kubenswrapper[17876]: I0313 10:46:58.334868 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-check-endpoints/0.log"
Mar 13 10:46:58.336219 master-0 kubenswrapper[17876]: I0313 10:46:58.336186 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log"
Mar 13 10:46:58.336838 master-0 kubenswrapper[17876]: I0313 10:46:58.336795 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e" exitCode=0
Mar 13 10:46:58.336838 master-0 kubenswrapper[17876]: I0313 10:46:58.336825 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852" exitCode=0
Mar 13 10:46:58.336838 master-0 kubenswrapper[17876]: I0313 10:46:58.336834 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1" exitCode=0
Mar 13 10:46:58.336995 master-0 kubenswrapper[17876]: I0313 10:46:58.336843 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2" exitCode=2
Mar 13 10:46:58.336995 master-0 kubenswrapper[17876]: I0313 10:46:58.336903 17876 scope.go:117] "RemoveContainer" containerID="8e8f8f57a168775c200793e552fc98f6fa3129d85e0ede2ff0d1df9451ff0848"
Mar 13 10:46:58.339209 master-0 kubenswrapper[17876]: I0313 10:46:58.339169 17876 generic.go:334] "Generic (PLEG): container finished" podID="f57bfc81-1c24-4b56-be43-08a173a82b76" containerID="cb729ad0e1626dd8b0150006e31be5ecd648bdaf7e7a26953eb61e56168cbdf3" exitCode=0
Mar 13 10:46:58.339209 master-0 kubenswrapper[17876]: I0313 10:46:58.339203 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f57bfc81-1c24-4b56-be43-08a173a82b76","Type":"ContainerDied","Data":"cb729ad0e1626dd8b0150006e31be5ecd648bdaf7e7a26953eb61e56168cbdf3"}
Mar 13 10:46:58.340381 master-0 kubenswrapper[17876]: I0313 10:46:58.340315 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:46:59.350119 master-0 kubenswrapper[17876]: I0313 10:46:59.350041 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"3a18cac8a90d6913a6a0391d805cddc9","Type":"ContainerStarted","Data":"b3eed89c50140b243696a6b00e9f42b9a2506d5578bc99cda4095ffe7165758d"}
Mar 13 10:46:59.351185 master-0 kubenswrapper[17876]: E0313 10:46:59.351131 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:46:59.351185 master-0 kubenswrapper[17876]: I0313 10:46:59.351152 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:46:59.353845 master-0 kubenswrapper[17876]: I0313 10:46:59.353809 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log"
Mar 13 10:46:59.774817 master-0 kubenswrapper[17876]: I0313 10:46:59.774735 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 13 10:46:59.775785 master-0 kubenswrapper[17876]: I0313 10:46:59.775723 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:46:59.920801 master-0 kubenswrapper[17876]: I0313 10:46:59.919344 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f57bfc81-1c24-4b56-be43-08a173a82b76-kube-api-access\") pod \"f57bfc81-1c24-4b56-be43-08a173a82b76\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") "
Mar 13 10:46:59.952156 master-0 kubenswrapper[17876]: I0313 10:46:59.951215 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-kubelet-dir\") pod \"f57bfc81-1c24-4b56-be43-08a173a82b76\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") "
Mar 13 10:46:59.952156 master-0 kubenswrapper[17876]: I0313 10:46:59.951356 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-var-lock\") pod \"f57bfc81-1c24-4b56-be43-08a173a82b76\" (UID: \"f57bfc81-1c24-4b56-be43-08a173a82b76\") "
Mar 13 10:46:59.956479 master-0 kubenswrapper[17876]: I0313 10:46:59.954638 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f57bfc81-1c24-4b56-be43-08a173a82b76-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "f57bfc81-1c24-4b56-be43-08a173a82b76" (UID: "f57bfc81-1c24-4b56-be43-08a173a82b76"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:46:59.957073 master-0 kubenswrapper[17876]: I0313 10:46:59.957014 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "f57bfc81-1c24-4b56-be43-08a173a82b76" (UID: "f57bfc81-1c24-4b56-be43-08a173a82b76"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:00.052994 master-0 kubenswrapper[17876]: I0313 10:47:00.050822 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-var-lock" (OuterVolumeSpecName: "var-lock") pod "f57bfc81-1c24-4b56-be43-08a173a82b76" (UID: "f57bfc81-1c24-4b56-be43-08a173a82b76"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:00.055334 master-0 kubenswrapper[17876]: I0313 10:47:00.054963 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f57bfc81-1c24-4b56-be43-08a173a82b76-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:00.055334 master-0 kubenswrapper[17876]: I0313 10:47:00.055064 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:00.156526 master-0 kubenswrapper[17876]: I0313 10:47:00.156443 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f57bfc81-1c24-4b56-be43-08a173a82b76-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:00.297428 master-0 kubenswrapper[17876]: I0313 10:47:00.297382 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log"
Mar 13 10:47:00.298470 master-0 kubenswrapper[17876]: I0313 10:47:00.298437 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:00.299435 master-0 kubenswrapper[17876]: I0313 10:47:00.299385 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:00.300060 master-0 kubenswrapper[17876]: I0313 10:47:00.300027 17876 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:00.358990 master-0 kubenswrapper[17876]: I0313 10:47:00.358936 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") "
Mar 13 10:47:00.358990 master-0 kubenswrapper[17876]: I0313 10:47:00.359013 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") "
Mar 13 10:47:00.359671 master-0 kubenswrapper[17876]: I0313 10:47:00.359088 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") pod \"077dd10388b9e3e48a07382126e86621\" (UID: \"077dd10388b9e3e48a07382126e86621\") "
Mar 13 10:47:00.359671 master-0 kubenswrapper[17876]: I0313 10:47:00.359627 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:00.359671 master-0 kubenswrapper[17876]: I0313 10:47:00.359667 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:00.359796 master-0 kubenswrapper[17876]: I0313 10:47:00.359703 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "077dd10388b9e3e48a07382126e86621" (UID: "077dd10388b9e3e48a07382126e86621"). InnerVolumeSpecName "resource-dir".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:47:00.364497 master-0 kubenswrapper[17876]: I0313 10:47:00.364450 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_077dd10388b9e3e48a07382126e86621/kube-apiserver-cert-syncer/0.log" Mar 13 10:47:00.365151 master-0 kubenswrapper[17876]: I0313 10:47:00.365119 17876 generic.go:334] "Generic (PLEG): container finished" podID="077dd10388b9e3e48a07382126e86621" containerID="76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38" exitCode=0 Mar 13 10:47:00.365268 master-0 kubenswrapper[17876]: I0313 10:47:00.365197 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:47:00.365499 master-0 kubenswrapper[17876]: I0313 10:47:00.365259 17876 scope.go:117] "RemoveContainer" containerID="7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e" Mar 13 10:47:00.367423 master-0 kubenswrapper[17876]: I0313 10:47:00.367376 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"f57bfc81-1c24-4b56-be43-08a173a82b76","Type":"ContainerDied","Data":"8b7339795fbc7798a7e32d45b2db1baa834187107e63224c4e430fc52fcca69e"} Mar 13 10:47:00.367423 master-0 kubenswrapper[17876]: I0313 10:47:00.367413 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 13 10:47:00.367528 master-0 kubenswrapper[17876]: I0313 10:47:00.367437 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b7339795fbc7798a7e32d45b2db1baa834187107e63224c4e430fc52fcca69e" Mar 13 10:47:00.368743 master-0 kubenswrapper[17876]: E0313 10:47:00.368653 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:47:00.379672 master-0 kubenswrapper[17876]: I0313 10:47:00.379630 17876 scope.go:117] "RemoveContainer" containerID="661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852" Mar 13 10:47:00.392528 master-0 kubenswrapper[17876]: I0313 10:47:00.392440 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:00.393433 master-0 kubenswrapper[17876]: I0313 10:47:00.393391 17876 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:00.395484 master-0 kubenswrapper[17876]: I0313 10:47:00.395452 17876 status_manager.go:851] "Failed to get status for pod" podUID="077dd10388b9e3e48a07382126e86621" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:00.396529 master-0 kubenswrapper[17876]: I0313 10:47:00.396441 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:00.432441 master-0 kubenswrapper[17876]: I0313 10:47:00.432407 17876 scope.go:117] "RemoveContainer" containerID="994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1" Mar 13 10:47:00.455828 master-0 kubenswrapper[17876]: I0313 10:47:00.455668 17876 scope.go:117] "RemoveContainer" containerID="245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2" Mar 13 10:47:00.460296 master-0 kubenswrapper[17876]: I0313 10:47:00.460258 17876 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:47:00.460400 master-0 kubenswrapper[17876]: I0313 10:47:00.460302 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:47:00.460400 master-0 kubenswrapper[17876]: I0313 10:47:00.460321 17876 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/077dd10388b9e3e48a07382126e86621-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:47:00.469977 master-0 kubenswrapper[17876]: I0313 10:47:00.469959 17876 scope.go:117] "RemoveContainer" containerID="76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38" Mar 13 
10:47:00.497379 master-0 kubenswrapper[17876]: I0313 10:47:00.497350 17876 scope.go:117] "RemoveContainer" containerID="4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29" Mar 13 10:47:00.506656 master-0 kubenswrapper[17876]: I0313 10:47:00.506603 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="077dd10388b9e3e48a07382126e86621" path="/var/lib/kubelet/pods/077dd10388b9e3e48a07382126e86621/volumes" Mar 13 10:47:00.517429 master-0 kubenswrapper[17876]: I0313 10:47:00.517348 17876 scope.go:117] "RemoveContainer" containerID="7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e" Mar 13 10:47:00.518485 master-0 kubenswrapper[17876]: E0313 10:47:00.518446 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e\": container with ID starting with 7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e not found: ID does not exist" containerID="7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e" Mar 13 10:47:00.518536 master-0 kubenswrapper[17876]: I0313 10:47:00.518500 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e"} err="failed to get container status \"7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e\": rpc error: code = NotFound desc = could not find container \"7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e\": container with ID starting with 7a6c8c7097829ab34ec3d11712561482468cf4057f48929c4d9bfaf8a89d721e not found: ID does not exist" Mar 13 10:47:00.518536 master-0 kubenswrapper[17876]: I0313 10:47:00.518532 17876 scope.go:117] "RemoveContainer" containerID="661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852" Mar 13 10:47:00.518883 master-0 kubenswrapper[17876]: E0313 10:47:00.518847 17876 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852\": container with ID starting with 661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852 not found: ID does not exist" containerID="661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852" Mar 13 10:47:00.518941 master-0 kubenswrapper[17876]: I0313 10:47:00.518896 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852"} err="failed to get container status \"661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852\": rpc error: code = NotFound desc = could not find container \"661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852\": container with ID starting with 661e31f5d40e07453dec5b5715ae4f2ac20f2bf77a5c093b6c86123371e19852 not found: ID does not exist" Mar 13 10:47:00.518941 master-0 kubenswrapper[17876]: I0313 10:47:00.518926 17876 scope.go:117] "RemoveContainer" containerID="994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1" Mar 13 10:47:00.519389 master-0 kubenswrapper[17876]: E0313 10:47:00.519356 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1\": container with ID starting with 994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1 not found: ID does not exist" containerID="994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1" Mar 13 10:47:00.519455 master-0 kubenswrapper[17876]: I0313 10:47:00.519383 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1"} err="failed to get container status 
\"994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1\": rpc error: code = NotFound desc = could not find container \"994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1\": container with ID starting with 994f48932a655f1602e8a064ee4feaa3571da054e5fc27af291f90c9737f99e1 not found: ID does not exist" Mar 13 10:47:00.519455 master-0 kubenswrapper[17876]: I0313 10:47:00.519399 17876 scope.go:117] "RemoveContainer" containerID="245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2" Mar 13 10:47:00.519880 master-0 kubenswrapper[17876]: E0313 10:47:00.519852 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2\": container with ID starting with 245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2 not found: ID does not exist" containerID="245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2" Mar 13 10:47:00.519943 master-0 kubenswrapper[17876]: I0313 10:47:00.519902 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2"} err="failed to get container status \"245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2\": rpc error: code = NotFound desc = could not find container \"245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2\": container with ID starting with 245616dbcb6c10c11506dd5bf50d7218750025f50a64d087e0418a61098fc1b2 not found: ID does not exist" Mar 13 10:47:00.519943 master-0 kubenswrapper[17876]: I0313 10:47:00.519925 17876 scope.go:117] "RemoveContainer" containerID="76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38" Mar 13 10:47:00.520433 master-0 kubenswrapper[17876]: E0313 10:47:00.520409 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38\": container with ID starting with 76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38 not found: ID does not exist" containerID="76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38" Mar 13 10:47:00.520478 master-0 kubenswrapper[17876]: I0313 10:47:00.520442 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38"} err="failed to get container status \"76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38\": rpc error: code = NotFound desc = could not find container \"76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38\": container with ID starting with 76a33e08fe4a7d85b7bb9012f1aa4ae599d27b8c6ba621ef565cdec674a44e38 not found: ID does not exist" Mar 13 10:47:00.520478 master-0 kubenswrapper[17876]: I0313 10:47:00.520460 17876 scope.go:117] "RemoveContainer" containerID="4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29" Mar 13 10:47:00.520801 master-0 kubenswrapper[17876]: E0313 10:47:00.520768 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29\": container with ID starting with 4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29 not found: ID does not exist" containerID="4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29" Mar 13 10:47:00.520855 master-0 kubenswrapper[17876]: I0313 10:47:00.520806 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29"} err="failed to get container status \"4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29\": rpc error: code = NotFound desc = could not find container 
\"4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29\": container with ID starting with 4a7704ea8b69b48404133ac613de2aef8b353207f2ae732bf003f8a2eb848a29 not found: ID does not exist" Mar 13 10:47:02.499861 master-0 kubenswrapper[17876]: I0313 10:47:02.499778 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:04.401624 master-0 kubenswrapper[17876]: I0313 10:47:04.401574 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_fd72828c-ef4f-4cfb-88ea-5e7c7d45c960/installer/0.log" Mar 13 10:47:04.402315 master-0 kubenswrapper[17876]: I0313 10:47:04.401666 17876 generic.go:334] "Generic (PLEG): container finished" podID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" containerID="24c84f241d3e7c5bd250d1c21d977eb70d1ed2e63b98759512919e0541c9e2e3" exitCode=1 Mar 13 10:47:04.402315 master-0 kubenswrapper[17876]: I0313 10:47:04.401701 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960","Type":"ContainerDied","Data":"24c84f241d3e7c5bd250d1c21d977eb70d1ed2e63b98759512919e0541c9e2e3"} Mar 13 10:47:04.402569 master-0 kubenswrapper[17876]: I0313 10:47:04.402532 17876 status_manager.go:851] "Failed to get status for pod" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" pod="openshift-etcd/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:04.403401 master-0 kubenswrapper[17876]: I0313 10:47:04.403360 17876 status_manager.go:851] "Failed to get status for pod" 
podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.073805 master-0 kubenswrapper[17876]: E0313 10:47:05.073691 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.074719 master-0 kubenswrapper[17876]: E0313 10:47:05.074622 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.075726 master-0 kubenswrapper[17876]: E0313 10:47:05.075653 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.076566 master-0 kubenswrapper[17876]: E0313 10:47:05.076509 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.077437 master-0 kubenswrapper[17876]: E0313 10:47:05.077367 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.077517 master-0 kubenswrapper[17876]: I0313 10:47:05.077466 17876 controller.go:115] "failed to update 
lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 10:47:05.078290 master-0 kubenswrapper[17876]: E0313 10:47:05.078198 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 13 10:47:05.279698 master-0 kubenswrapper[17876]: E0313 10:47:05.279609 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 13 10:47:05.681938 master-0 kubenswrapper[17876]: E0313 10:47:05.681661 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 13 10:47:05.810972 master-0 kubenswrapper[17876]: I0313 10:47:05.810932 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_fd72828c-ef4f-4cfb-88ea-5e7c7d45c960/installer/0.log" Mar 13 10:47:05.811205 master-0 kubenswrapper[17876]: I0313 10:47:05.811000 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 10:47:05.812020 master-0 kubenswrapper[17876]: I0313 10:47:05.811974 17876 status_manager.go:851] "Failed to get status for pod" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" pod="openshift-etcd/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.812512 master-0 kubenswrapper[17876]: I0313 10:47:05.812477 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:05.976195 master-0 kubenswrapper[17876]: I0313 10:47:05.976025 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-var-lock\") pod \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " Mar 13 10:47:05.976195 master-0 kubenswrapper[17876]: I0313 10:47:05.976139 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kube-api-access\") pod \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " Mar 13 10:47:05.976195 master-0 kubenswrapper[17876]: I0313 10:47:05.976171 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kubelet-dir\") pod \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\" (UID: \"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960\") " Mar 13 10:47:05.976480 master-0 
kubenswrapper[17876]: I0313 10:47:05.976341 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-var-lock" (OuterVolumeSpecName: "var-lock") pod "fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" (UID: "fd72828c-ef4f-4cfb-88ea-5e7c7d45c960"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:47:05.976524 master-0 kubenswrapper[17876]: I0313 10:47:05.976460 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" (UID: "fd72828c-ef4f-4cfb-88ea-5e7c7d45c960"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:47:05.976898 master-0 kubenswrapper[17876]: I0313 10:47:05.976852 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:47:05.976964 master-0 kubenswrapper[17876]: I0313 10:47:05.976893 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:47:05.981908 master-0 kubenswrapper[17876]: I0313 10:47:05.981455 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" (UID: "fd72828c-ef4f-4cfb-88ea-5e7c7d45c960"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:47:06.078264 master-0 kubenswrapper[17876]: I0313 10:47:06.078167 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fd72828c-ef4f-4cfb-88ea-5e7c7d45c960-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:47:06.201596 master-0 kubenswrapper[17876]: E0313 10:47:06.201421 17876 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c60d451cbcd22 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:3a18cac8a90d6913a6a0391d805cddc9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:46:58.298776866 +0000 UTC m=+326.134583342,LastTimestamp:2026-03-13 10:46:58.298776866 +0000 UTC m=+326.134583342,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:47:06.416644 master-0 kubenswrapper[17876]: I0313 10:47:06.416587 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-master-0_fd72828c-ef4f-4cfb-88ea-5e7c7d45c960/installer/0.log" Mar 13 10:47:06.416909 master-0 kubenswrapper[17876]: I0313 10:47:06.416662 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" 
event={"ID":"fd72828c-ef4f-4cfb-88ea-5e7c7d45c960","Type":"ContainerDied","Data":"30f03912a2a587ba44133889b8b9353c7c43c3787cc5614c5915702674574e15"} Mar 13 10:47:06.416909 master-0 kubenswrapper[17876]: I0313 10:47:06.416691 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="30f03912a2a587ba44133889b8b9353c7c43c3787cc5614c5915702674574e15" Mar 13 10:47:06.416909 master-0 kubenswrapper[17876]: I0313 10:47:06.416761 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 13 10:47:06.433930 master-0 kubenswrapper[17876]: I0313 10:47:06.433844 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:06.434571 master-0 kubenswrapper[17876]: I0313 10:47:06.434530 17876 status_manager.go:851] "Failed to get status for pod" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" pod="openshift-etcd/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:47:06.482937 master-0 kubenswrapper[17876]: E0313 10:47:06.482824 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 13 10:47:08.086377 master-0 kubenswrapper[17876]: E0313 10:47:08.086219 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s"
Mar 13 10:47:11.287876 master-0 kubenswrapper[17876]: E0313 10:47:11.287766 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s"
Mar 13 10:47:11.494187 master-0 kubenswrapper[17876]: I0313 10:47:11.494042 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:11.496515 master-0 kubenswrapper[17876]: I0313 10:47:11.496438 17876 status_manager.go:851] "Failed to get status for pod" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" pod="openshift-etcd/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:11.497306 master-0 kubenswrapper[17876]: I0313 10:47:11.497231 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:11.517875 master-0 kubenswrapper[17876]: I0313 10:47:11.517801 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:11.518081 master-0 kubenswrapper[17876]: I0313 10:47:11.517893 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:11.519255 master-0 kubenswrapper[17876]: E0313 10:47:11.519195 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:11.520049 master-0 kubenswrapper[17876]: I0313 10:47:11.519993 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:12.470873 master-0 kubenswrapper[17876]: I0313 10:47:12.470779 17876 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc" exitCode=0
Mar 13 10:47:12.471541 master-0 kubenswrapper[17876]: I0313 10:47:12.470875 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerDied","Data":"ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc"}
Mar 13 10:47:12.471541 master-0 kubenswrapper[17876]: I0313 10:47:12.470949 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"65dc5df30f5bcbea123483cb4df080cfdf2e8c9dfa03543f4432e88a6448a4b7"}
Mar 13 10:47:12.471541 master-0 kubenswrapper[17876]: I0313 10:47:12.471459 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:12.471541 master-0 kubenswrapper[17876]: I0313 10:47:12.471492 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:12.472436 master-0 kubenswrapper[17876]: I0313 10:47:12.472377 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:12.472530 master-0 kubenswrapper[17876]: E0313 10:47:12.472377 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:12.473376 master-0 kubenswrapper[17876]: I0313 10:47:12.473304 17876 status_manager.go:851] "Failed to get status for pod" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" pod="openshift-etcd/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:12.502518 master-0 kubenswrapper[17876]: I0313 10:47:12.502423 17876 status_manager.go:851] "Failed to get status for pod" podUID="48512e02022680c9d90092634f0fc146" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:12.503212 master-0 kubenswrapper[17876]: I0313 10:47:12.503144 17876 status_manager.go:851] "Failed to get status for pod" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" pod="openshift-kube-apiserver/installer-4-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:12.503753 master-0 kubenswrapper[17876]: I0313 10:47:12.503709 17876 status_manager.go:851] "Failed to get status for pod" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" pod="openshift-etcd/installer-2-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:47:13.484726 master-0 kubenswrapper[17876]: I0313 10:47:13.484667 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a"}
Mar 13 10:47:13.484726 master-0 kubenswrapper[17876]: I0313 10:47:13.484713 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab"}
Mar 13 10:47:13.484726 master-0 kubenswrapper[17876]: I0313 10:47:13.484723 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77"}
Mar 13 10:47:13.484726 master-0 kubenswrapper[17876]: I0313 10:47:13.484731 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f"}
Mar 13 10:47:14.504851 master-0 kubenswrapper[17876]: I0313 10:47:14.504796 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:14.504851 master-0 kubenswrapper[17876]: I0313 10:47:14.504836 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:14.505560 master-0 kubenswrapper[17876]: I0313 10:47:14.505179 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:14.505560 master-0 kubenswrapper[17876]: I0313 10:47:14.505219 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda"}
Mar 13 10:47:16.523303 master-0 kubenswrapper[17876]: I0313 10:47:16.523152 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:16.523303 master-0 kubenswrapper[17876]: I0313 10:47:16.523289 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:16.530330 master-0 kubenswrapper[17876]: I0313 10:47:16.530264 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:17.054000 master-0 kubenswrapper[17876]: I0313 10:47:17.053922 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:47:17.058772 master-0 kubenswrapper[17876]: I0313 10:47:17.058695 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-bc7bfc94d-xftlk"
Mar 13 10:47:19.527901 master-0 kubenswrapper[17876]: I0313 10:47:19.527834 17876 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:19.603118 master-0 kubenswrapper[17876]: I0313 10:47:19.595632 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="48512e02022680c9d90092634f0fc146" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437"
Mar 13 10:47:19.749066 master-0 kubenswrapper[17876]: I0313 10:47:19.747930 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:19.749066 master-0 kubenswrapper[17876]: I0313 10:47:19.747983 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:19.751314 master-0 kubenswrapper[17876]: I0313 10:47:19.751267 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="48512e02022680c9d90092634f0fc146" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437"
Mar 13 10:47:19.751620 master-0 kubenswrapper[17876]: I0313 10:47:19.751587 17876 status_manager.go:308] "Container readiness changed before pod has synced" pod="openshift-kube-apiserver/kube-apiserver-master-0" containerID="cri-o://342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f"
Mar 13 10:47:19.751620 master-0 kubenswrapper[17876]: I0313 10:47:19.751618 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:20.754491 master-0 kubenswrapper[17876]: I0313 10:47:20.753981 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:20.755332 master-0 kubenswrapper[17876]: I0313 10:47:20.755213 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="e46ba118-be6f-4d13-a663-c91d541478cc"
Mar 13 10:47:20.757301 master-0 kubenswrapper[17876]: I0313 10:47:20.757259 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="48512e02022680c9d90092634f0fc146" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437"
Mar 13 10:47:28.309174 master-0 kubenswrapper[17876]: I0313 10:47:28.309029 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 10:47:28.605212 master-0 kubenswrapper[17876]: I0313 10:47:28.605028 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 13 10:47:28.707955 master-0 kubenswrapper[17876]: I0313 10:47:28.707875 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-jckzr"
Mar 13 10:47:28.770864 master-0 kubenswrapper[17876]: I0313 10:47:28.770791 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 13 10:47:29.176849 master-0 kubenswrapper[17876]: I0313 10:47:29.176733 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 13 10:47:29.194674 master-0 kubenswrapper[17876]: I0313 10:47:29.194600 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web"
Mar 13 10:47:29.464116 master-0 kubenswrapper[17876]: I0313 10:47:29.463950 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 10:47:29.786923 master-0 kubenswrapper[17876]: I0313 10:47:29.786803 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 10:47:30.120044 master-0 kubenswrapper[17876]: I0313 10:47:30.119901 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 13 10:47:30.318892 master-0 kubenswrapper[17876]: I0313 10:47:30.318808 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 13 10:47:30.329306 master-0 kubenswrapper[17876]: I0313 10:47:30.329261 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-5pbvv"
Mar 13 10:47:30.457941 master-0 kubenswrapper[17876]: I0313 10:47:30.457882 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 10:47:30.477449 master-0 kubenswrapper[17876]: I0313 10:47:30.477394 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 13 10:47:30.941877 master-0 kubenswrapper[17876]: I0313 10:47:30.941834 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 13 10:47:31.008281 master-0 kubenswrapper[17876]: I0313 10:47:31.008224 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 13 10:47:31.091490 master-0 kubenswrapper[17876]: I0313 10:47:31.091415 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 10:47:31.173345 master-0 kubenswrapper[17876]: I0313 10:47:31.173044 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 13 10:47:31.178268 master-0 kubenswrapper[17876]: I0313 10:47:31.178213 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 13 10:47:31.203302 master-0 kubenswrapper[17876]: I0313 10:47:31.203134 17876 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 13 10:47:31.211244 master-0 kubenswrapper[17876]: I0313 10:47:31.211177 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:47:31.211244 master-0 kubenswrapper[17876]: I0313 10:47:31.211257 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:47:31.221404 master-0 kubenswrapper[17876]: I0313 10:47:31.221354 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:47:31.234858 master-0 kubenswrapper[17876]: I0313 10:47:31.234817 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 10:47:31.239399 master-0 kubenswrapper[17876]: I0313 10:47:31.239324 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=12.239280653 podStartE2EDuration="12.239280653s" podCreationTimestamp="2026-03-13 10:47:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:47:31.236402521 +0000 UTC m=+359.072209027" watchObservedRunningTime="2026-03-13 10:47:31.239280653 +0000 UTC m=+359.075087129"
Mar 13 10:47:31.452063 master-0 kubenswrapper[17876]: I0313 10:47:31.451989 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 13 10:47:31.643117 master-0 kubenswrapper[17876]: I0313 10:47:31.642162 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 13 10:47:31.658299 master-0 kubenswrapper[17876]: I0313 10:47:31.658235 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-c9gmn"
Mar 13 10:47:31.729072 master-0 kubenswrapper[17876]: I0313 10:47:31.729010 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 10:47:31.765715 master-0 kubenswrapper[17876]: I0313 10:47:31.765658 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 10:47:31.823995 master-0 kubenswrapper[17876]: I0313 10:47:31.823928 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 13 10:47:31.918946 master-0 kubenswrapper[17876]: I0313 10:47:31.918821 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 13 10:47:31.983586 master-0 kubenswrapper[17876]: I0313 10:47:31.983528 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 13 10:47:32.022876 master-0 kubenswrapper[17876]: I0313 10:47:32.022813 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 13 10:47:32.205421 master-0 kubenswrapper[17876]: I0313 10:47:32.205176 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 13 10:47:32.249529 master-0 kubenswrapper[17876]: I0313 10:47:32.249466 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 13 10:47:32.309292 master-0 kubenswrapper[17876]: I0313 10:47:32.309211 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 10:47:32.332751 master-0 kubenswrapper[17876]: I0313 10:47:32.332676 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 10:47:32.402275 master-0 kubenswrapper[17876]: I0313 10:47:32.402185 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-qddwq"
Mar 13 10:47:32.423339 master-0 kubenswrapper[17876]: I0313 10:47:32.423271 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config"
Mar 13 10:47:32.459343 master-0 kubenswrapper[17876]: I0313 10:47:32.459203 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 10:47:32.547337 master-0 kubenswrapper[17876]: I0313 10:47:32.547272 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 13 10:47:32.582158 master-0 kubenswrapper[17876]: I0313 10:47:32.582085 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 13 10:47:32.632163 master-0 kubenswrapper[17876]: I0313 10:47:32.632124 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 10:47:32.726280 master-0 kubenswrapper[17876]: I0313 10:47:32.726160 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 10:47:32.757241 master-0 kubenswrapper[17876]: I0313 10:47:32.757187 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 10:47:32.913434 master-0 kubenswrapper[17876]: I0313 10:47:32.913387 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-ff7d6"
Mar 13 10:47:32.916438 master-0 kubenswrapper[17876]: I0313 10:47:32.916380 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 10:47:32.936435 master-0 kubenswrapper[17876]: I0313 10:47:32.936285 17876 scope.go:117] "RemoveContainer" containerID="d4c4f345608352771d181c87ae83f87748ecbf6ccdee52cebdd330e421648437"
Mar 13 10:47:32.945556 master-0 kubenswrapper[17876]: I0313 10:47:32.945512 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 13 10:47:33.137203 master-0 kubenswrapper[17876]: I0313 10:47:33.137137 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 13 10:47:33.217926 master-0 kubenswrapper[17876]: I0313 10:47:33.217861 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 13 10:47:33.243405 master-0 kubenswrapper[17876]: I0313 10:47:33.243329 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 10:47:33.245495 master-0 kubenswrapper[17876]: I0313 10:47:33.245415 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 13 10:47:33.267691 master-0 kubenswrapper[17876]: I0313 10:47:33.267625 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 13 10:47:33.279597 master-0 kubenswrapper[17876]: I0313 10:47:33.279548 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 13 10:47:33.287295 master-0 kubenswrapper[17876]: I0313 10:47:33.287252 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 13 10:47:33.377158 master-0 kubenswrapper[17876]: I0313 10:47:33.377080 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 13 10:47:33.551873 master-0 kubenswrapper[17876]: I0313 10:47:33.551749 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:47:33.742055 master-0 kubenswrapper[17876]: I0313 10:47:33.741978 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s"
Mar 13 10:47:33.844592 master-0 kubenswrapper[17876]: I0313 10:47:33.844474 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2p4lb"
Mar 13 10:47:33.894060 master-0 kubenswrapper[17876]: I0313 10:47:33.893951 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 13 10:47:34.071374 master-0 kubenswrapper[17876]: I0313 10:47:34.071285 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 13 10:47:34.078824 master-0 kubenswrapper[17876]: I0313 10:47:34.078762 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 13 10:47:34.151377 master-0 kubenswrapper[17876]: I0313 10:47:34.151308 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 10:47:34.164273 master-0 kubenswrapper[17876]: I0313 10:47:34.164239 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 13 10:47:34.208044 master-0 kubenswrapper[17876]: I0313 10:47:34.207983 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-fsw7z"
Mar 13 10:47:34.226729 master-0 kubenswrapper[17876]: I0313 10:47:34.226655 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 10:47:34.261531 master-0 kubenswrapper[17876]: I0313 10:47:34.261459 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 13 10:47:34.285306 master-0 kubenswrapper[17876]: I0313 10:47:34.285248 17876 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 13 10:47:34.289695 master-0 kubenswrapper[17876]: I0313 10:47:34.289668 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:47:34.294243 master-0 kubenswrapper[17876]: I0313 10:47:34.294199 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 13 10:47:34.355816 master-0 kubenswrapper[17876]: I0313 10:47:34.355757 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 13 10:47:34.441202 master-0 kubenswrapper[17876]: I0313 10:47:34.441022 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt"
Mar 13 10:47:34.536859 master-0 kubenswrapper[17876]: I0313 10:47:34.536805 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist"
Mar 13 10:47:34.536859 master-0 kubenswrapper[17876]: I0313 10:47:34.536835 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 13 10:47:34.544844 master-0 kubenswrapper[17876]: I0313 10:47:34.544799 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 13 10:47:34.545842 master-0 kubenswrapper[17876]: I0313 10:47:34.545811 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 13 10:47:34.575407 master-0 kubenswrapper[17876]: I0313 10:47:34.575336 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 13 10:47:34.581133 master-0 kubenswrapper[17876]: I0313 10:47:34.581109 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 10:47:34.641906 master-0 kubenswrapper[17876]: I0313 10:47:34.641834 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 13 10:47:34.657685 master-0 kubenswrapper[17876]: I0313 10:47:34.657616 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 13 10:47:34.661487 master-0 kubenswrapper[17876]: I0313 10:47:34.661419 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 13 10:47:34.686278 master-0 kubenswrapper[17876]: I0313 10:47:34.686216 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 10:47:34.749548 master-0 kubenswrapper[17876]: I0313 10:47:34.749398 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 13 10:47:34.770474 master-0 kubenswrapper[17876]: I0313 10:47:34.770420 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 13 10:47:34.895351 master-0 kubenswrapper[17876]: I0313 10:47:34.895285 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 13 10:47:34.908171 master-0 kubenswrapper[17876]: I0313 10:47:34.908084 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 13 10:47:34.910023 master-0 kubenswrapper[17876]: I0313 10:47:34.909987 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 13 10:47:35.097729 master-0 kubenswrapper[17876]: I0313 10:47:35.096344 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 13 10:47:35.097729 master-0 kubenswrapper[17876]: I0313 10:47:35.096751 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 13 10:47:35.207232 master-0 kubenswrapper[17876]: I0313 10:47:35.207170 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 13 10:47:35.211087 master-0 kubenswrapper[17876]: I0313 10:47:35.211012 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:47:35.212075 master-0 kubenswrapper[17876]: I0313 10:47:35.212039 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 13 10:47:35.212213 master-0 kubenswrapper[17876]: I0313 10:47:35.212179 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-5lpgr"
Mar 13 10:47:35.216085 master-0 kubenswrapper[17876]: I0313 10:47:35.216055 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 10:47:35.217473 master-0 kubenswrapper[17876]: I0313 10:47:35.217436 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 10:47:35.277146 master-0 kubenswrapper[17876]: I0313 10:47:35.277071 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 13 10:47:35.388318 master-0 kubenswrapper[17876]: I0313 10:47:35.388264 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 10:47:35.424388 master-0 kubenswrapper[17876]: I0313 10:47:35.424324 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 13 10:47:35.476951 master-0 kubenswrapper[17876]: I0313 10:47:35.476892 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 13 10:47:35.483112 master-0 kubenswrapper[17876]: I0313 10:47:35.483046 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 10:47:35.574615 master-0 kubenswrapper[17876]: I0313 10:47:35.574541 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 10:47:35.643990 master-0 kubenswrapper[17876]: I0313 10:47:35.643835 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 13 10:47:35.683288 master-0 kubenswrapper[17876]: I0313 10:47:35.683224 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-tvfvf"
Mar 13 10:47:35.735733 master-0 kubenswrapper[17876]: I0313 10:47:35.735635 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 10:47:35.817664 master-0 kubenswrapper[17876]: I0313 10:47:35.817574 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-vggdd"
Mar 13 10:47:35.835056 master-0 kubenswrapper[17876]: I0313 10:47:35.834998 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 10:47:35.856600 master-0 kubenswrapper[17876]: I0313 10:47:35.856539 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 10:47:35.900238 master-0 kubenswrapper[17876]: I0313 10:47:35.900108 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 10:47:35.903521 master-0 kubenswrapper[17876]: I0313 10:47:35.903480 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 13 10:47:36.011045 master-0 kubenswrapper[17876]: I0313 10:47:36.010994 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 13 10:47:36.027459 master-0 kubenswrapper[17876]: I0313 10:47:36.027419 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 13 10:47:36.041546 master-0 kubenswrapper[17876]: I0313 10:47:36.041509 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 10:47:36.093647 master-0 kubenswrapper[17876]: I0313 10:47:36.093587 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 13 10:47:36.186226 master-0 kubenswrapper[17876]: I0313 10:47:36.186077 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 10:47:36.189338 master-0 kubenswrapper[17876]: I0313 10:47:36.189295 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x4n7x"
Mar 13 10:47:36.220064 master-0 kubenswrapper[17876]: I0313 10:47:36.219968 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 13 10:47:36.289647 master-0 kubenswrapper[17876]: I0313 10:47:36.246300 17876 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 10:47:36.289647 master-0 kubenswrapper[17876]: I0313 10:47:36.279471 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-hggr7"
Mar 13 10:47:36.292180 master-0 kubenswrapper[17876]: I0313 10:47:36.291244 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 13 10:47:36.355130 master-0 kubenswrapper[17876]: I0313 10:47:36.354842 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 13 10:47:36.394692 master-0 kubenswrapper[17876]: I0313 10:47:36.394125 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-6g1360gbh170n"
Mar 13 10:47:36.431433 master-0 kubenswrapper[17876]: I0313 10:47:36.431396 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 13 10:47:36.437702 master-0 kubenswrapper[17876]: I0313 10:47:36.437642 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 13 10:47:36.462136 master-0 kubenswrapper[17876]: I0313 10:47:36.462112 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 10:47:36.517288 master-0 kubenswrapper[17876]: I0313 10:47:36.517252 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-24kvc"
Mar 13 10:47:36.527000 master-0 kubenswrapper[17876]: I0313 10:47:36.526953 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 13 10:47:36.668040 master-0 kubenswrapper[17876]: I0313 10:47:36.667665 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xsbqr"
Mar 13 10:47:36.731949 master-0 kubenswrapper[17876]: I0313 10:47:36.731829 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 10:47:36.736660 master-0 kubenswrapper[17876]: I0313 10:47:36.736581 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 13 10:47:36.798870 master-0 kubenswrapper[17876]: I0313 10:47:36.798823 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:47:36.807943 master-0 kubenswrapper[17876]: I0313 10:47:36.807888 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 10:47:36.820859 master-0 kubenswrapper[17876]: I0313 10:47:36.820756 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 13 10:47:36.851825 master-0 kubenswrapper[17876]: I0313 10:47:36.851751 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 13 10:47:36.890817 master-0 kubenswrapper[17876]: I0313 10:47:36.890470 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 13 10:47:36.913593 master-0 kubenswrapper[17876]: I0313 10:47:36.913515 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 13 10:47:36.941542 master-0 kubenswrapper[17876]: I0313 10:47:36.941477 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 13 10:47:36.990174 master-0 kubenswrapper[17876]: I0313 10:47:36.989999 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 10:47:37.043394 master-0 kubenswrapper[17876]: I0313 10:47:37.043320 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 13 10:47:37.049126 master-0 kubenswrapper[17876]: I0313 10:47:37.048575 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 10:47:37.082074 master-0 kubenswrapper[17876]: I0313 10:47:37.082000 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-r9v82"
Mar 13 10:47:37.095989 master-0 kubenswrapper[17876]: I0313 10:47:37.095899 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 13 10:47:37.104558 master-0 kubenswrapper[17876]: I0313 10:47:37.104512 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 13 10:47:37.115077 master-0 kubenswrapper[17876]: I0313 10:47:37.115007 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 13 10:47:37.128630 master-0 kubenswrapper[17876]: I0313
10:47:37.128485 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 10:47:37.140826 master-0 kubenswrapper[17876]: I0313 10:47:37.140752 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 13 10:47:37.220835 master-0 kubenswrapper[17876]: I0313 10:47:37.220786 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 13 10:47:37.226069 master-0 kubenswrapper[17876]: I0313 10:47:37.226039 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 13 10:47:37.247984 master-0 kubenswrapper[17876]: I0313 10:47:37.247865 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 13 10:47:37.266645 master-0 kubenswrapper[17876]: I0313 10:47:37.266576 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 13 10:47:37.276774 master-0 kubenswrapper[17876]: I0313 10:47:37.276729 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 10:47:37.295565 master-0 kubenswrapper[17876]: I0313 10:47:37.295504 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 13 10:47:37.307213 master-0 kubenswrapper[17876]: I0313 10:47:37.307129 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 10:47:37.344212 master-0 kubenswrapper[17876]: I0313 10:47:37.344140 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 13 10:47:37.346830 master-0 kubenswrapper[17876]: I0313 10:47:37.346794 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 13 10:47:37.362010 master-0 kubenswrapper[17876]: I0313 10:47:37.361951 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 13 10:47:37.421791 master-0 kubenswrapper[17876]: I0313 10:47:37.421744 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 10:47:37.539691 master-0 kubenswrapper[17876]: I0313 10:47:37.539559 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 13 10:47:37.588119 master-0 kubenswrapper[17876]: I0313 10:47:37.584212 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 13 10:47:37.649948 master-0 kubenswrapper[17876]: I0313 10:47:37.649886 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 13 10:47:37.709997 master-0 kubenswrapper[17876]: I0313 10:47:37.709928 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 13 10:47:37.730227 master-0 kubenswrapper[17876]: I0313 10:47:37.730189 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 13 10:47:37.735893 master-0 kubenswrapper[17876]: I0313 10:47:37.735846 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 10:47:37.737694 master-0 kubenswrapper[17876]: I0313 10:47:37.737513 17876 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress"/"router-certs-default" Mar 13 10:47:37.849350 master-0 kubenswrapper[17876]: I0313 10:47:37.849036 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 13 10:47:37.905057 master-0 kubenswrapper[17876]: I0313 10:47:37.905002 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 13 10:47:37.924981 master-0 kubenswrapper[17876]: I0313 10:47:37.924850 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 13 10:47:38.003984 master-0 kubenswrapper[17876]: I0313 10:47:38.003894 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 13 10:47:38.014229 master-0 kubenswrapper[17876]: I0313 10:47:38.012415 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 13 10:47:38.022839 master-0 kubenswrapper[17876]: I0313 10:47:38.022776 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 13 10:47:38.050749 master-0 kubenswrapper[17876]: I0313 10:47:38.050687 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 13 10:47:38.110187 master-0 kubenswrapper[17876]: I0313 10:47:38.110035 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-jpsrw" Mar 13 10:47:38.144320 master-0 kubenswrapper[17876]: I0313 10:47:38.144246 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 13 10:47:38.188530 master-0 kubenswrapper[17876]: I0313 
10:47:38.188470 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:47:38.190512 master-0 kubenswrapper[17876]: I0313 10:47:38.190456 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 10:47:38.224960 master-0 kubenswrapper[17876]: I0313 10:47:38.224907 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 13 10:47:38.259803 master-0 kubenswrapper[17876]: I0313 10:47:38.259708 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 13 10:47:38.299265 master-0 kubenswrapper[17876]: I0313 10:47:38.299208 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 10:47:38.323224 master-0 kubenswrapper[17876]: I0313 10:47:38.323184 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 13 10:47:38.347987 master-0 kubenswrapper[17876]: I0313 10:47:38.347923 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 13 10:47:38.435980 master-0 kubenswrapper[17876]: I0313 10:47:38.435893 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 13 10:47:38.471971 master-0 kubenswrapper[17876]: I0313 10:47:38.471869 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 13 10:47:38.506469 master-0 kubenswrapper[17876]: I0313 10:47:38.506124 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 10:47:38.551007 master-0 kubenswrapper[17876]: I0313 10:47:38.550960 17876 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 13 10:47:38.571116 master-0 kubenswrapper[17876]: I0313 10:47:38.571028 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 10:47:38.604596 master-0 kubenswrapper[17876]: I0313 10:47:38.604540 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 10:47:38.654159 master-0 kubenswrapper[17876]: I0313 10:47:38.654076 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 10:47:38.657341 master-0 kubenswrapper[17876]: I0313 10:47:38.657295 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 13 10:47:38.673419 master-0 kubenswrapper[17876]: I0313 10:47:38.673352 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 10:47:38.679387 master-0 kubenswrapper[17876]: I0313 10:47:38.679318 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 10:47:38.708535 master-0 kubenswrapper[17876]: I0313 10:47:38.708403 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 10:47:38.732255 master-0 kubenswrapper[17876]: I0313 10:47:38.732210 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 13 10:47:38.733232 master-0 kubenswrapper[17876]: I0313 10:47:38.733189 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 13 10:47:38.737988 master-0 kubenswrapper[17876]: I0313 10:47:38.737937 17876 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 10:47:38.773791 master-0 kubenswrapper[17876]: I0313 10:47:38.773714 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:47:38.827619 master-0 kubenswrapper[17876]: I0313 10:47:38.827554 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 10:47:38.847450 master-0 kubenswrapper[17876]: I0313 10:47:38.847375 17876 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 13 10:47:38.971913 master-0 kubenswrapper[17876]: I0313 10:47:38.971791 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-9r4nm" Mar 13 10:47:39.005499 master-0 kubenswrapper[17876]: I0313 10:47:39.005440 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-kkkpw" Mar 13 10:47:39.016029 master-0 kubenswrapper[17876]: I0313 10:47:39.015881 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 13 10:47:39.036196 master-0 kubenswrapper[17876]: I0313 10:47:39.036146 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 13 10:47:39.110674 master-0 kubenswrapper[17876]: I0313 10:47:39.110510 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-clxlg" Mar 13 10:47:39.142023 master-0 kubenswrapper[17876]: I0313 10:47:39.141971 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 10:47:39.161123 master-0 kubenswrapper[17876]: I0313 10:47:39.161046 17876 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 13 10:47:39.231182 master-0 kubenswrapper[17876]: I0313 10:47:39.230473 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 10:47:39.231824 master-0 kubenswrapper[17876]: I0313 10:47:39.231806 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 13 10:47:39.263564 master-0 kubenswrapper[17876]: I0313 10:47:39.263516 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 13 10:47:39.272782 master-0 kubenswrapper[17876]: I0313 10:47:39.272660 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 13 10:47:39.346725 master-0 kubenswrapper[17876]: I0313 10:47:39.346669 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 10:47:39.355197 master-0 kubenswrapper[17876]: I0313 10:47:39.355154 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 13 10:47:39.406796 master-0 kubenswrapper[17876]: I0313 10:47:39.406727 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 13 10:47:39.417668 master-0 kubenswrapper[17876]: I0313 10:47:39.417632 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 13 10:47:39.426985 master-0 kubenswrapper[17876]: I0313 10:47:39.426955 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 13 10:47:39.431721 master-0 kubenswrapper[17876]: I0313 10:47:39.431698 17876 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 13 10:47:39.464163 master-0 kubenswrapper[17876]: I0313 10:47:39.464085 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 10:47:39.470255 master-0 kubenswrapper[17876]: I0313 10:47:39.470214 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 13 10:47:39.570977 master-0 kubenswrapper[17876]: I0313 10:47:39.570815 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 13 10:47:39.624650 master-0 kubenswrapper[17876]: I0313 10:47:39.624592 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 10:47:39.699035 master-0 kubenswrapper[17876]: I0313 10:47:39.698937 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 13 10:47:39.710711 master-0 kubenswrapper[17876]: I0313 10:47:39.710618 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 13 10:47:39.717925 master-0 kubenswrapper[17876]: I0313 10:47:39.717819 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 13 10:47:39.727748 master-0 kubenswrapper[17876]: I0313 10:47:39.727641 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 13 10:47:39.755382 master-0 kubenswrapper[17876]: I0313 10:47:39.755311 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 13 10:47:39.755907 master-0 kubenswrapper[17876]: I0313 10:47:39.755871 17876 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-t57pn" Mar 13 10:47:39.837084 master-0 kubenswrapper[17876]: I0313 10:47:39.836926 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 10:47:39.959714 master-0 kubenswrapper[17876]: I0313 10:47:39.959639 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 13 10:47:40.008528 master-0 kubenswrapper[17876]: I0313 10:47:40.008466 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 13 10:47:40.025839 master-0 kubenswrapper[17876]: I0313 10:47:40.025713 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 13 10:47:40.041494 master-0 kubenswrapper[17876]: I0313 10:47:40.041400 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 10:47:40.064460 master-0 kubenswrapper[17876]: I0313 10:47:40.064400 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 10:47:40.133765 master-0 kubenswrapper[17876]: I0313 10:47:40.133453 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 13 10:47:40.134946 master-0 kubenswrapper[17876]: I0313 10:47:40.134153 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 13 10:47:40.162857 master-0 kubenswrapper[17876]: I0313 10:47:40.162761 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" 
Mar 13 10:47:40.205023 master-0 kubenswrapper[17876]: I0313 10:47:40.204962 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 10:47:40.214057 master-0 kubenswrapper[17876]: I0313 10:47:40.213996 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 13 10:47:40.235233 master-0 kubenswrapper[17876]: I0313 10:47:40.235160 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-2p9p4" Mar 13 10:47:40.279626 master-0 kubenswrapper[17876]: I0313 10:47:40.279573 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 13 10:47:40.319366 master-0 kubenswrapper[17876]: I0313 10:47:40.319274 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 13 10:47:40.349079 master-0 kubenswrapper[17876]: I0313 10:47:40.349020 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-894vf" Mar 13 10:47:40.399343 master-0 kubenswrapper[17876]: I0313 10:47:40.399192 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 13 10:47:40.408396 master-0 kubenswrapper[17876]: I0313 10:47:40.408341 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-fi4otoct6tdgf" Mar 13 10:47:40.705453 master-0 kubenswrapper[17876]: I0313 10:47:40.705341 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 10:47:40.779297 master-0 kubenswrapper[17876]: I0313 10:47:40.779211 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 13 10:47:40.819580 master-0 kubenswrapper[17876]: I0313 10:47:40.819525 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 13 10:47:40.825839 master-0 kubenswrapper[17876]: I0313 10:47:40.825803 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 10:47:40.837321 master-0 kubenswrapper[17876]: I0313 10:47:40.837268 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 13 10:47:40.883444 master-0 kubenswrapper[17876]: I0313 10:47:40.883382 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 13 10:47:40.968286 master-0 kubenswrapper[17876]: I0313 10:47:40.968052 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 13 10:47:40.976310 master-0 kubenswrapper[17876]: I0313 10:47:40.976259 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 13 10:47:41.038538 master-0 kubenswrapper[17876]: I0313 10:47:41.038495 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 10:47:41.052718 master-0 kubenswrapper[17876]: I0313 10:47:41.052666 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 13 10:47:41.271172 master-0 kubenswrapper[17876]: I0313 10:47:41.270995 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 13 10:47:41.399344 master-0 kubenswrapper[17876]: I0313 10:47:41.399275 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 
13 10:47:41.406903 master-0 kubenswrapper[17876]: I0313 10:47:41.406862 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 13 10:47:41.411042 master-0 kubenswrapper[17876]: I0313 10:47:41.410995 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 13 10:47:41.472260 master-0 kubenswrapper[17876]: I0313 10:47:41.472202 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-c2nqj" Mar 13 10:47:41.503404 master-0 kubenswrapper[17876]: I0313 10:47:41.503349 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 13 10:47:41.554005 master-0 kubenswrapper[17876]: I0313 10:47:41.553864 17876 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 10:47:41.585773 master-0 kubenswrapper[17876]: I0313 10:47:41.585690 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 10:47:41.642723 master-0 kubenswrapper[17876]: I0313 10:47:41.642661 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 10:47:41.684123 master-0 kubenswrapper[17876]: I0313 10:47:41.684024 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-d2pmx" Mar 13 10:47:41.737566 master-0 kubenswrapper[17876]: I0313 10:47:41.737485 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 13 10:47:41.738002 master-0 kubenswrapper[17876]: I0313 10:47:41.737902 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" 
podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor" containerID="cri-o://b3eed89c50140b243696a6b00e9f42b9a2506d5578bc99cda4095ffe7165758d" gracePeriod=5 Mar 13 10:47:41.851748 master-0 kubenswrapper[17876]: I0313 10:47:41.851635 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 13 10:47:41.891420 master-0 kubenswrapper[17876]: I0313 10:47:41.891356 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 10:47:41.951708 master-0 kubenswrapper[17876]: I0313 10:47:41.951650 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 13 10:47:42.082585 master-0 kubenswrapper[17876]: I0313 10:47:42.082401 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 13 10:47:42.146867 master-0 kubenswrapper[17876]: I0313 10:47:42.146808 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 13 10:47:42.213036 master-0 kubenswrapper[17876]: I0313 10:47:42.212977 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 10:47:42.247162 master-0 kubenswrapper[17876]: I0313 10:47:42.247073 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 13 10:47:42.290749 master-0 kubenswrapper[17876]: I0313 10:47:42.290677 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:47:42.337721 master-0 kubenswrapper[17876]: I0313 10:47:42.337654 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-5258b" Mar 13 10:47:42.445658 master-0 kubenswrapper[17876]: 
I0313 10:47:42.445537 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 10:47:42.447524 master-0 kubenswrapper[17876]: I0313 10:47:42.447477 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 10:47:42.595247 master-0 kubenswrapper[17876]: I0313 10:47:42.595169 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 13 10:47:42.654537 master-0 kubenswrapper[17876]: I0313 10:47:42.654458 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 10:47:42.711080 master-0 kubenswrapper[17876]: I0313 10:47:42.710971 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 13 10:47:42.898631 master-0 kubenswrapper[17876]: I0313 10:47:42.898575 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 10:47:42.912678 master-0 kubenswrapper[17876]: I0313 10:47:42.912617 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 13 10:47:42.969431 master-0 kubenswrapper[17876]: I0313 10:47:42.969319 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 13 10:47:43.035811 master-0 kubenswrapper[17876]: I0313 10:47:43.035759 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 13 10:47:43.047279 master-0 kubenswrapper[17876]: I0313 10:47:43.047219 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 13 10:47:43.069416 master-0 kubenswrapper[17876]: I0313 10:47:43.069358 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 13 10:47:43.133734 master-0 kubenswrapper[17876]: I0313 10:47:43.133661 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-h7hlp"
Mar 13 10:47:43.152252 master-0 kubenswrapper[17876]: I0313 10:47:43.152203 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-dr74kngie93ut"
Mar 13 10:47:43.181358 master-0 kubenswrapper[17876]: I0313 10:47:43.181317 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dvqsb"
Mar 13 10:47:43.273613 master-0 kubenswrapper[17876]: I0313 10:47:43.273498 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 10:47:43.294536 master-0 kubenswrapper[17876]: I0313 10:47:43.294486 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 10:47:43.302348 master-0 kubenswrapper[17876]: I0313 10:47:43.302293 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 13 10:47:43.337492 master-0 kubenswrapper[17876]: I0313 10:47:43.337445 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 13 10:47:43.356249 master-0 kubenswrapper[17876]: I0313 10:47:43.356204 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 13 10:47:43.404448 master-0 kubenswrapper[17876]: I0313 10:47:43.404388 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 13 10:47:43.579906 master-0 kubenswrapper[17876]: I0313 10:47:43.579769 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 10:47:43.592651 master-0 kubenswrapper[17876]: I0313 10:47:43.592585 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls"
Mar 13 10:47:43.596289 master-0 kubenswrapper[17876]: I0313 10:47:43.596263 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 10:47:43.629942 master-0 kubenswrapper[17876]: I0313 10:47:43.629864 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy"
Mar 13 10:47:43.694830 master-0 kubenswrapper[17876]: I0313 10:47:43.694725 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 13 10:47:43.798975 master-0 kubenswrapper[17876]: I0313 10:47:43.798922 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 10:47:43.835419 master-0 kubenswrapper[17876]: I0313 10:47:43.835290 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 10:47:43.856865 master-0 kubenswrapper[17876]: I0313 10:47:43.856803 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 13 10:47:43.892708 master-0 kubenswrapper[17876]: I0313 10:47:43.892648 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 13 10:47:43.899835 master-0 kubenswrapper[17876]: I0313 10:47:43.899766 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 13 10:47:43.918561 master-0 kubenswrapper[17876]: I0313 10:47:43.918496 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 10:47:44.025737 master-0 kubenswrapper[17876]: I0313 10:47:44.025632 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 10:47:44.033486 master-0 kubenswrapper[17876]: I0313 10:47:44.033437 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 10:47:44.168006 master-0 kubenswrapper[17876]: I0313 10:47:44.167923 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 13 10:47:44.287176 master-0 kubenswrapper[17876]: I0313 10:47:44.287107 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 10:47:44.308590 master-0 kubenswrapper[17876]: I0313 10:47:44.308523 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules"
Mar 13 10:47:44.401524 master-0 kubenswrapper[17876]: I0313 10:47:44.401475 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 13 10:47:44.969666 master-0 kubenswrapper[17876]: I0313 10:47:44.969591 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt"
Mar 13 10:47:45.119704 master-0 kubenswrapper[17876]: I0313 10:47:45.119637 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 10:47:45.228757 master-0 kubenswrapper[17876]: I0313 10:47:45.228611 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 13 10:47:45.500925 master-0 kubenswrapper[17876]: I0313 10:47:45.500756 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 13 10:47:45.856599 master-0 kubenswrapper[17876]: I0313 10:47:45.856490 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 13 10:47:46.102197 master-0 kubenswrapper[17876]: I0313 10:47:46.102071 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 13 10:47:46.348424 master-0 kubenswrapper[17876]: I0313 10:47:46.348361 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 10:47:46.644797 master-0 kubenswrapper[17876]: I0313 10:47:46.644751 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 10:47:46.887564 master-0 kubenswrapper[17876]: I0313 10:47:46.887499 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 13 10:47:47.068980 master-0 kubenswrapper[17876]: I0313 10:47:47.068863 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_3a18cac8a90d6913a6a0391d805cddc9/startup-monitor/0.log"
Mar 13 10:47:47.068980 master-0 kubenswrapper[17876]: I0313 10:47:47.068929 17876 generic.go:334] "Generic (PLEG): container finished" podID="3a18cac8a90d6913a6a0391d805cddc9" containerID="b3eed89c50140b243696a6b00e9f42b9a2506d5578bc99cda4095ffe7165758d" exitCode=137
Mar 13 10:47:47.322918 master-0 kubenswrapper[17876]: I0313 10:47:47.322794 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_3a18cac8a90d6913a6a0391d805cddc9/startup-monitor/0.log"
Mar 13 10:47:47.322918 master-0 kubenswrapper[17876]: I0313 10:47:47.322894 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:47:47.499708 master-0 kubenswrapper[17876]: I0313 10:47:47.499630 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") "
Mar 13 10:47:47.500523 master-0 kubenswrapper[17876]: I0313 10:47:47.499763 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") "
Mar 13 10:47:47.500523 master-0 kubenswrapper[17876]: I0313 10:47:47.500357 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:47.500925 master-0 kubenswrapper[17876]: I0313 10:47:47.500573 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") "
Mar 13 10:47:47.500925 master-0 kubenswrapper[17876]: I0313 10:47:47.500850 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") "
Mar 13 10:47:47.500925 master-0 kubenswrapper[17876]: I0313 10:47:47.500898 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") "
Mar 13 10:47:47.501128 master-0 kubenswrapper[17876]: I0313 10:47:47.500805 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests" (OuterVolumeSpecName: "manifests") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:47.501128 master-0 kubenswrapper[17876]: I0313 10:47:47.500995 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock" (OuterVolumeSpecName: "var-lock") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:47.501128 master-0 kubenswrapper[17876]: I0313 10:47:47.501063 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log" (OuterVolumeSpecName: "var-log") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:47.501454 master-0 kubenswrapper[17876]: I0313 10:47:47.501415 17876 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:47.501454 master-0 kubenswrapper[17876]: I0313 10:47:47.501442 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:47.501454 master-0 kubenswrapper[17876]: I0313 10:47:47.501454 17876 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:47.501628 master-0 kubenswrapper[17876]: I0313 10:47:47.501467 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:47.509495 master-0 kubenswrapper[17876]: I0313 10:47:47.509428 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:47:47.602047 master-0 kubenswrapper[17876]: I0313 10:47:47.601900 17876 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:47:48.078776 master-0 kubenswrapper[17876]: I0313 10:47:48.078692 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_3a18cac8a90d6913a6a0391d805cddc9/startup-monitor/0.log"
Mar 13 10:47:48.079139 master-0 kubenswrapper[17876]: I0313 10:47:48.078798 17876 scope.go:117] "RemoveContainer" containerID="b3eed89c50140b243696a6b00e9f42b9a2506d5578bc99cda4095ffe7165758d"
Mar 13 10:47:48.079139 master-0 kubenswrapper[17876]: I0313 10:47:48.078971 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:47:48.501421 master-0 kubenswrapper[17876]: I0313 10:47:48.501359 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a18cac8a90d6913a6a0391d805cddc9" path="/var/lib/kubelet/pods/3a18cac8a90d6913a6a0391d805cddc9/volumes"
Mar 13 10:48:26.373651 master-0 kubenswrapper[17876]: I0313 10:48:26.373545 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-r7t4l"]
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: E0313 10:48:26.373976 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" containerName="installer"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: I0313 10:48:26.374005 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" containerName="installer"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: E0313 10:48:26.374035 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: I0313 10:48:26.374041 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: E0313 10:48:26.374059 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" containerName="installer"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: I0313 10:48:26.374065 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" containerName="installer"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: I0313 10:48:26.374261 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd72828c-ef4f-4cfb-88ea-5e7c7d45c960" containerName="installer"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: I0313 10:48:26.374282 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f57bfc81-1c24-4b56-be43-08a173a82b76" containerName="installer"
Mar 13 10:48:26.376019 master-0 kubenswrapper[17876]: I0313 10:48:26.374299 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor"
Mar 13 10:48:26.376972 master-0 kubenswrapper[17876]: I0313 10:48:26.376927 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.379460 master-0 kubenswrapper[17876]: I0313 10:48:26.379418 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 13 10:48:26.379699 master-0 kubenswrapper[17876]: I0313 10:48:26.379680 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 13 10:48:26.379817 master-0 kubenswrapper[17876]: I0313 10:48:26.379800 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 13 10:48:26.379933 master-0 kubenswrapper[17876]: I0313 10:48:26.379916 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client"
Mar 13 10:48:26.380206 master-0 kubenswrapper[17876]: I0313 10:48:26.380179 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-7cjkc"
Mar 13 10:48:26.380509 master-0 kubenswrapper[17876]: I0313 10:48:26.380474 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 13 10:48:26.385012 master-0 kubenswrapper[17876]: I0313 10:48:26.384965 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 13 10:48:26.400145 master-0 kubenswrapper[17876]: I0313 10:48:26.399680 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-r7t4l"]
Mar 13 10:48:26.496347 master-0 kubenswrapper[17876]: I0313 10:48:26.496277 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496347 master-0 kubenswrapper[17876]: I0313 10:48:26.496345 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496734 master-0 kubenswrapper[17876]: I0313 10:48:26.496438 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-metrics-client-ca\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496734 master-0 kubenswrapper[17876]: I0313 10:48:26.496494 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-serving-certs-ca-bundle\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496734 master-0 kubenswrapper[17876]: I0313 10:48:26.496553 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496734 master-0 kubenswrapper[17876]: I0313 10:48:26.496589 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-federate-client-tls\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496734 master-0 kubenswrapper[17876]: I0313 10:48:26.496618 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-client-tls\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.496734 master-0 kubenswrapper[17876]: I0313 10:48:26.496643 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzl9p\" (UniqueName: \"kubernetes.io/projected/84d7119d-7c04-4168-9008-83414ea5d79e-kube-api-access-jzl9p\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.597592 master-0 kubenswrapper[17876]: I0313 10:48:26.597496 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-serving-certs-ca-bundle\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.597592 master-0 kubenswrapper[17876]: I0313 10:48:26.597605 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.598361 master-0 kubenswrapper[17876]: I0313 10:48:26.597824 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-federate-client-tls\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.598361 master-0 kubenswrapper[17876]: I0313 10:48:26.598341 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-client-tls\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.598754 master-0 kubenswrapper[17876]: I0313 10:48:26.598388 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzl9p\" (UniqueName: \"kubernetes.io/projected/84d7119d-7c04-4168-9008-83414ea5d79e-kube-api-access-jzl9p\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.598754 master-0 kubenswrapper[17876]: I0313 10:48:26.598446 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.598754 master-0 kubenswrapper[17876]: I0313 10:48:26.598495 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.598754 master-0 kubenswrapper[17876]: I0313 10:48:26.598573 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-metrics-client-ca\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.602652 master-0 kubenswrapper[17876]: I0313 10:48:26.602252 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.603754 master-0 kubenswrapper[17876]: I0313 10:48:26.603687 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-metrics-client-ca\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.615551 master-0 kubenswrapper[17876]: I0313 10:48:26.605527 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-serving-certs-ca-bundle\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.615551 master-0 kubenswrapper[17876]: I0313 10:48:26.613004 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-federate-client-tls\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.615551 master-0 kubenswrapper[17876]: I0313 10:48:26.614906 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-client-tls\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.615551 master-0 kubenswrapper[17876]: I0313 10:48:26.615413 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.620328 master-0 kubenswrapper[17876]: I0313 10:48:26.620003 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.647445 master-0 kubenswrapper[17876]: I0313 10:48:26.647317 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzl9p\" (UniqueName: \"kubernetes.io/projected/84d7119d-7c04-4168-9008-83414ea5d79e-kube-api-access-jzl9p\") pod \"telemeter-client-6644589945-r7t4l\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:26.714628 master-0 kubenswrapper[17876]: I0313 10:48:26.714547 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l"
Mar 13 10:48:27.285758 master-0 kubenswrapper[17876]: I0313 10:48:27.285659 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-r7t4l"]
Mar 13 10:48:27.301885 master-0 kubenswrapper[17876]: I0313 10:48:27.301849 17876 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 13 10:48:27.407124 master-0 kubenswrapper[17876]: I0313 10:48:27.407020 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerStarted","Data":"ef7d8004c639f1641cf26738f7adb78f5d72c09997b6c7788ef2750d25c63a07"}
Mar 13 10:48:30.451775 master-0 kubenswrapper[17876]: I0313 10:48:30.451709 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerStarted","Data":"7a15eb5240bfdd7af38a32fca27f9dc99472114c912a39deca7e2c742392f9c0"}
Mar 13 10:48:30.451775 master-0 kubenswrapper[17876]: I0313 10:48:30.451777 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerStarted","Data":"e79967d0b76f0c0562a0acc8d7863ef6baf6011d87f652e4e181a797f0f82df2"}
Mar 13 10:48:30.452527 master-0 kubenswrapper[17876]: I0313 10:48:30.451792 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerStarted","Data":"908b7375b91dcf2baab20c05c0f72cd3fc66c40e43b8f4f484d6e4c6f9345dde"}
Mar 13 10:48:31.398997 master-0 kubenswrapper[17876]: I0313 10:48:31.398885 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" podStartSLOduration=3.342766336 podStartE2EDuration="5.398839723s" podCreationTimestamp="2026-03-13 10:48:26 +0000 UTC" firstStartedPulling="2026-03-13 10:48:27.301721286 +0000 UTC m=+415.137527762" lastFinishedPulling="2026-03-13 10:48:29.357794673 +0000 UTC m=+417.193601149" observedRunningTime="2026-03-13 10:48:30.485432465 +0000 UTC m=+418.321238941" watchObservedRunningTime="2026-03-13 10:48:31.398839723 +0000 UTC m=+419.234646199"
Mar 13 10:48:31.402856 master-0 kubenswrapper[17876]: I0313 10:48:31.402787 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6d4c7bc587-hj6ht"]
Mar 13 10:48:31.404023 master-0 kubenswrapper[17876]: I0313 10:48:31.403976 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.436357 master-0 kubenswrapper[17876]: I0313 10:48:31.436273 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6d4c7bc587-hj6ht"]
Mar 13 10:48:31.527418 master-0 kubenswrapper[17876]: I0313 10:48:31.527329 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-service-ca\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.528379 master-0 kubenswrapper[17876]: I0313 10:48:31.527448 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-serving-cert\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.528379 master-0 kubenswrapper[17876]: I0313 10:48:31.527504 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-oauth-config\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.528379 master-0 kubenswrapper[17876]: I0313 10:48:31.527566 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdt6b\" (UniqueName: \"kubernetes.io/projected/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-kube-api-access-gdt6b\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.528379 master-0 kubenswrapper[17876]: I0313 10:48:31.527597 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-oauth-serving-cert\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.528379 master-0 kubenswrapper[17876]: I0313 10:48:31.527634 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-trusted-ca-bundle\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.528379 master-0 kubenswrapper[17876]: I0313 10:48:31.527668 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-config\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.629077 master-0 kubenswrapper[17876]: I0313 10:48:31.629020 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-service-ca\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.629454 master-0 kubenswrapper[17876]: I0313 10:48:31.629412 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-serving-cert\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.629566 master-0 kubenswrapper[17876]: I0313 10:48:31.629544 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-oauth-config\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.629702 master-0 kubenswrapper[17876]: I0313 10:48:31.629678 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdt6b\" (UniqueName: \"kubernetes.io/projected/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-kube-api-access-gdt6b\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.629833 master-0 kubenswrapper[17876]: I0313 10:48:31.629794 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-oauth-serving-cert\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.630164 master-0 kubenswrapper[17876]: I0313 10:48:31.630138 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-trusted-ca-bundle\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.630447 master-0 kubenswrapper[17876]: I0313 10:48:31.630412 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-config\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.631651 master-0 kubenswrapper[17876]: I0313 10:48:31.630845 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-service-ca\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.631651 master-0 kubenswrapper[17876]: I0313 10:48:31.630965 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-oauth-serving-cert\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.631651 master-0 kubenswrapper[17876]: I0313 10:48:31.631429 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-config\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.632351 master-0 kubenswrapper[17876]: I0313 10:48:31.632305 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-trusted-ca-bundle\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht"
Mar 13 10:48:31.635235 master-0 kubenswrapper[17876]: I0313 10:48:31.635205 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName:
\"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-serving-cert\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:31.638879 master-0 kubenswrapper[17876]: I0313 10:48:31.638848 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-oauth-config\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:31.645487 master-0 kubenswrapper[17876]: I0313 10:48:31.645461 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdt6b\" (UniqueName: \"kubernetes.io/projected/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-kube-api-access-gdt6b\") pod \"console-6d4c7bc587-hj6ht\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:31.731685 master-0 kubenswrapper[17876]: I0313 10:48:31.731524 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:32.183187 master-0 kubenswrapper[17876]: I0313 10:48:32.182472 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d4c7bc587-hj6ht"] Mar 13 10:48:32.221960 master-0 kubenswrapper[17876]: I0313 10:48:32.221901 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d4c7bc587-hj6ht"] Mar 13 10:48:32.238238 master-0 kubenswrapper[17876]: I0313 10:48:32.238168 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5ccf58899f-qzj28"] Mar 13 10:48:32.239462 master-0 kubenswrapper[17876]: I0313 10:48:32.239429 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.242883 master-0 kubenswrapper[17876]: I0313 10:48:32.242734 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8xtz\" (UniqueName: \"kubernetes.io/projected/f8190215-7d9b-4508-a5cb-cee577d23254-kube-api-access-r8xtz\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.242883 master-0 kubenswrapper[17876]: I0313 10:48:32.242809 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-trusted-ca-bundle\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.242883 master-0 kubenswrapper[17876]: I0313 10:48:32.242852 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-oauth-serving-cert\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.243219 master-0 kubenswrapper[17876]: I0313 10:48:32.242915 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-console-config\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.243219 master-0 kubenswrapper[17876]: I0313 10:48:32.242950 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-service-ca\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.243219 master-0 kubenswrapper[17876]: I0313 10:48:32.242978 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-oauth-config\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.243219 master-0 kubenswrapper[17876]: I0313 10:48:32.243033 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-serving-cert\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.243725 master-0 kubenswrapper[17876]: I0313 10:48:32.243680 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5ccf58899f-qzj28"] Mar 13 10:48:32.344925 master-0 kubenswrapper[17876]: I0313 10:48:32.344720 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8xtz\" (UniqueName: \"kubernetes.io/projected/f8190215-7d9b-4508-a5cb-cee577d23254-kube-api-access-r8xtz\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.344925 master-0 kubenswrapper[17876]: I0313 10:48:32.344765 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-trusted-ca-bundle\") pod \"console-5ccf58899f-qzj28\" (UID: 
\"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.344925 master-0 kubenswrapper[17876]: I0313 10:48:32.344820 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-oauth-serving-cert\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.344925 master-0 kubenswrapper[17876]: I0313 10:48:32.344915 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-console-config\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.345202 master-0 kubenswrapper[17876]: I0313 10:48:32.344983 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-service-ca\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.345202 master-0 kubenswrapper[17876]: I0313 10:48:32.345034 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-oauth-config\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.345202 master-0 kubenswrapper[17876]: I0313 10:48:32.345079 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-serving-cert\") pod 
\"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.346143 master-0 kubenswrapper[17876]: I0313 10:48:32.346078 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-trusted-ca-bundle\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.346556 master-0 kubenswrapper[17876]: I0313 10:48:32.346520 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-service-ca\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.346724 master-0 kubenswrapper[17876]: I0313 10:48:32.346645 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-console-config\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.347004 master-0 kubenswrapper[17876]: I0313 10:48:32.346970 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-oauth-serving-cert\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.348495 master-0 kubenswrapper[17876]: I0313 10:48:32.348459 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-oauth-config\") pod 
\"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.352121 master-0 kubenswrapper[17876]: I0313 10:48:32.348925 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-serving-cert\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.363513 master-0 kubenswrapper[17876]: I0313 10:48:32.363477 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8xtz\" (UniqueName: \"kubernetes.io/projected/f8190215-7d9b-4508-a5cb-cee577d23254-kube-api-access-r8xtz\") pod \"console-5ccf58899f-qzj28\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.473735 master-0 kubenswrapper[17876]: I0313 10:48:32.473600 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d4c7bc587-hj6ht" event={"ID":"5b4e6bca-b94c-405f-92ac-7e11d0e32acd","Type":"ContainerStarted","Data":"68fb6a2e5d642c510d9c2d0ee86d8713035602b3bd4f56d6ebd9e55a38f5ec0b"} Mar 13 10:48:32.473735 master-0 kubenswrapper[17876]: I0313 10:48:32.473651 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d4c7bc587-hj6ht" event={"ID":"5b4e6bca-b94c-405f-92ac-7e11d0e32acd","Type":"ContainerStarted","Data":"8d3e12452911a220ce754d8d34e68d140deea1e78b1166cfbc0236dc7de97170"} Mar 13 10:48:32.509061 master-0 kubenswrapper[17876]: I0313 10:48:32.508970 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6d4c7bc587-hj6ht" podStartSLOduration=1.5089363439999999 podStartE2EDuration="1.508936344s" podCreationTimestamp="2026-03-13 10:48:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:48:32.50353636 +0000 UTC m=+420.339342836" watchObservedRunningTime="2026-03-13 10:48:32.508936344 +0000 UTC m=+420.344742820" Mar 13 10:48:32.587161 master-0 kubenswrapper[17876]: I0313 10:48:32.587057 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:32.989319 master-0 kubenswrapper[17876]: I0313 10:48:32.988959 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5ccf58899f-qzj28"] Mar 13 10:48:32.995417 master-0 kubenswrapper[17876]: W0313 10:48:32.995336 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8190215_7d9b_4508_a5cb_cee577d23254.slice/crio-eef66ca75f37f7529919cca71023c61036beb37d3823b688f99f9ce2719e844e WatchSource:0}: Error finding container eef66ca75f37f7529919cca71023c61036beb37d3823b688f99f9ce2719e844e: Status 404 returned error can't find the container with id eef66ca75f37f7529919cca71023c61036beb37d3823b688f99f9ce2719e844e Mar 13 10:48:33.484125 master-0 kubenswrapper[17876]: I0313 10:48:33.484048 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ccf58899f-qzj28" event={"ID":"f8190215-7d9b-4508-a5cb-cee577d23254","Type":"ContainerStarted","Data":"2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea"} Mar 13 10:48:33.484436 master-0 kubenswrapper[17876]: I0313 10:48:33.484397 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ccf58899f-qzj28" event={"ID":"f8190215-7d9b-4508-a5cb-cee577d23254","Type":"ContainerStarted","Data":"eef66ca75f37f7529919cca71023c61036beb37d3823b688f99f9ce2719e844e"} Mar 13 10:48:40.016129 master-0 kubenswrapper[17876]: I0313 10:48:40.015979 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console/console-5ccf58899f-qzj28" podStartSLOduration=8.015960973 podStartE2EDuration="8.015960973s" podCreationTimestamp="2026-03-13 10:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:48:33.510264847 +0000 UTC m=+421.346071333" watchObservedRunningTime="2026-03-13 10:48:40.015960973 +0000 UTC m=+427.851767449" Mar 13 10:48:40.017857 master-0 kubenswrapper[17876]: I0313 10:48:40.017258 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5ccf58899f-qzj28"] Mar 13 10:48:40.057152 master-0 kubenswrapper[17876]: I0313 10:48:40.056788 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-59c5df894-vld5f"] Mar 13 10:48:40.057911 master-0 kubenswrapper[17876]: I0313 10:48:40.057648 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.083297 master-0 kubenswrapper[17876]: I0313 10:48:40.083243 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c5df894-vld5f"] Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242202 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-service-ca\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242261 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-trusted-ca-bundle\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " 
pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242292 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-oauth-config\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242312 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qdqrk\" (UniqueName: \"kubernetes.io/projected/342d69d8-d283-4b5b-ac00-42e8048c4ab2-kube-api-access-qdqrk\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242330 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-oauth-serving-cert\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242356 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-serving-cert\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.246238 master-0 kubenswrapper[17876]: I0313 10:48:40.242386 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-config\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.344170 master-0 kubenswrapper[17876]: I0313 10:48:40.343984 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-service-ca\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.344170 master-0 kubenswrapper[17876]: I0313 10:48:40.344064 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-trusted-ca-bundle\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.344558 master-0 kubenswrapper[17876]: I0313 10:48:40.344534 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-oauth-config\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.344780 master-0 kubenswrapper[17876]: I0313 10:48:40.344761 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qdqrk\" (UniqueName: \"kubernetes.io/projected/342d69d8-d283-4b5b-ac00-42e8048c4ab2-kube-api-access-qdqrk\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.344893 master-0 kubenswrapper[17876]: I0313 10:48:40.344880 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-oauth-serving-cert\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.345199 master-0 kubenswrapper[17876]: I0313 10:48:40.345161 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-service-ca\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.345324 master-0 kubenswrapper[17876]: I0313 10:48:40.345256 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-serving-cert\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.345410 master-0 kubenswrapper[17876]: I0313 10:48:40.345387 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-config\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.346048 master-0 kubenswrapper[17876]: I0313 10:48:40.345993 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-trusted-ca-bundle\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.346252 master-0 kubenswrapper[17876]: I0313 10:48:40.346206 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-oauth-serving-cert\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.346418 master-0 kubenswrapper[17876]: I0313 10:48:40.346386 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-config\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.348582 master-0 kubenswrapper[17876]: I0313 10:48:40.348521 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-serving-cert\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.348908 master-0 kubenswrapper[17876]: I0313 10:48:40.348862 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/342d69d8-d283-4b5b-ac00-42e8048c4ab2-console-oauth-config\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.362881 master-0 kubenswrapper[17876]: I0313 10:48:40.362822 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qdqrk\" (UniqueName: \"kubernetes.io/projected/342d69d8-d283-4b5b-ac00-42e8048c4ab2-kube-api-access-qdqrk\") pod \"console-59c5df894-vld5f\" (UID: \"342d69d8-d283-4b5b-ac00-42e8048c4ab2\") " pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.403689 master-0 kubenswrapper[17876]: I0313 10:48:40.403610 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:40.984542 master-0 kubenswrapper[17876]: I0313 10:48:40.983198 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59c5df894-vld5f"] Mar 13 10:48:40.988477 master-0 kubenswrapper[17876]: W0313 10:48:40.988431 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod342d69d8_d283_4b5b_ac00_42e8048c4ab2.slice/crio-37db6285484680a6a1faed9f648d05453db48f44302be3a3de1c1888a6c0e945 WatchSource:0}: Error finding container 37db6285484680a6a1faed9f648d05453db48f44302be3a3de1c1888a6c0e945: Status 404 returned error can't find the container with id 37db6285484680a6a1faed9f648d05453db48f44302be3a3de1c1888a6c0e945 Mar 13 10:48:41.556718 master-0 kubenswrapper[17876]: I0313 10:48:41.556658 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5df894-vld5f" event={"ID":"342d69d8-d283-4b5b-ac00-42e8048c4ab2","Type":"ContainerStarted","Data":"87e0076ee295f995bd071af13d2c544bcef200b56f7a92a46cbe5f962abd44de"} Mar 13 10:48:41.556718 master-0 kubenswrapper[17876]: I0313 10:48:41.556715 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59c5df894-vld5f" event={"ID":"342d69d8-d283-4b5b-ac00-42e8048c4ab2","Type":"ContainerStarted","Data":"37db6285484680a6a1faed9f648d05453db48f44302be3a3de1c1888a6c0e945"} Mar 13 10:48:41.579010 master-0 kubenswrapper[17876]: I0313 10:48:41.578900 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59c5df894-vld5f" podStartSLOduration=1.5788676430000002 podStartE2EDuration="1.578867643s" podCreationTimestamp="2026-03-13 10:48:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:48:41.5735172 +0000 UTC m=+429.409323676" 
watchObservedRunningTime="2026-03-13 10:48:41.578867643 +0000 UTC m=+429.414674119" Mar 13 10:48:41.733639 master-0 kubenswrapper[17876]: I0313 10:48:41.731959 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:42.588892 master-0 kubenswrapper[17876]: I0313 10:48:42.587941 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:48:50.404466 master-0 kubenswrapper[17876]: I0313 10:48:50.404402 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:50.404466 master-0 kubenswrapper[17876]: I0313 10:48:50.404464 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:50.408703 master-0 kubenswrapper[17876]: I0313 10:48:50.408637 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:50.627266 master-0 kubenswrapper[17876]: I0313 10:48:50.627211 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-59c5df894-vld5f" Mar 13 10:48:50.693530 master-0 kubenswrapper[17876]: I0313 10:48:50.693373 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bfb55f4b6-qf9q7"] Mar 13 10:48:57.459482 master-0 kubenswrapper[17876]: I0313 10:48:57.459406 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 10:48:57.460527 master-0 kubenswrapper[17876]: I0313 10:48:57.460506 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.465888 master-0 kubenswrapper[17876]: I0313 10:48:57.465839 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 13 10:48:57.467019 master-0 kubenswrapper[17876]: I0313 10:48:57.466978 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-z9gwk" Mar 13 10:48:57.477980 master-0 kubenswrapper[17876]: I0313 10:48:57.477548 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 10:48:57.525794 master-0 kubenswrapper[17876]: I0313 10:48:57.525705 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6d4c7bc587-hj6ht" podUID="5b4e6bca-b94c-405f-92ac-7e11d0e32acd" containerName="console" containerID="cri-o://68fb6a2e5d642c510d9c2d0ee86d8713035602b3bd4f56d6ebd9e55a38f5ec0b" gracePeriod=15 Mar 13 10:48:57.601199 master-0 kubenswrapper[17876]: I0313 10:48:57.601148 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.601436 master-0 kubenswrapper[17876]: I0313 10:48:57.601222 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-var-lock\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.601436 master-0 kubenswrapper[17876]: I0313 10:48:57.601289 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c635c3-af41-4465-887c-e0675fabb3e8-kube-api-access\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.679651 master-0 kubenswrapper[17876]: I0313 10:48:57.679563 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d4c7bc587-hj6ht_5b4e6bca-b94c-405f-92ac-7e11d0e32acd/console/0.log" Mar 13 10:48:57.679949 master-0 kubenswrapper[17876]: I0313 10:48:57.679724 17876 generic.go:334] "Generic (PLEG): container finished" podID="5b4e6bca-b94c-405f-92ac-7e11d0e32acd" containerID="68fb6a2e5d642c510d9c2d0ee86d8713035602b3bd4f56d6ebd9e55a38f5ec0b" exitCode=2 Mar 13 10:48:57.679949 master-0 kubenswrapper[17876]: I0313 10:48:57.679782 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d4c7bc587-hj6ht" event={"ID":"5b4e6bca-b94c-405f-92ac-7e11d0e32acd","Type":"ContainerDied","Data":"68fb6a2e5d642c510d9c2d0ee86d8713035602b3bd4f56d6ebd9e55a38f5ec0b"} Mar 13 10:48:57.702624 master-0 kubenswrapper[17876]: I0313 10:48:57.702530 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c635c3-af41-4465-887c-e0675fabb3e8-kube-api-access\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.702624 master-0 kubenswrapper[17876]: I0313 10:48:57.702613 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.703001 master-0 
kubenswrapper[17876]: I0313 10:48:57.702655 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-var-lock\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.703310 master-0 kubenswrapper[17876]: I0313 10:48:57.703252 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-var-lock\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.703389 master-0 kubenswrapper[17876]: I0313 10:48:57.703349 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.726708 master-0 kubenswrapper[17876]: I0313 10:48:57.723427 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c635c3-af41-4465-887c-e0675fabb3e8-kube-api-access\") pod \"installer-5-master-0\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.786502 master-0 kubenswrapper[17876]: I0313 10:48:57.786413 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 13 10:48:57.981424 master-0 kubenswrapper[17876]: I0313 10:48:57.981333 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d4c7bc587-hj6ht_5b4e6bca-b94c-405f-92ac-7e11d0e32acd/console/0.log" Mar 13 10:48:57.981746 master-0 kubenswrapper[17876]: I0313 10:48:57.981709 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:58.109336 master-0 kubenswrapper[17876]: I0313 10:48:58.109214 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-trusted-ca-bundle\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " Mar 13 10:48:58.109604 master-0 kubenswrapper[17876]: I0313 10:48:58.109454 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-config\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " Mar 13 10:48:58.109604 master-0 kubenswrapper[17876]: I0313 10:48:58.109496 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-serving-cert\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " Mar 13 10:48:58.109604 master-0 kubenswrapper[17876]: I0313 10:48:58.109527 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-service-ca\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " 
Mar 13 10:48:58.109604 master-0 kubenswrapper[17876]: I0313 10:48:58.109586 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdt6b\" (UniqueName: \"kubernetes.io/projected/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-kube-api-access-gdt6b\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " Mar 13 10:48:58.109755 master-0 kubenswrapper[17876]: I0313 10:48:58.109626 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-oauth-config\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " Mar 13 10:48:58.109755 master-0 kubenswrapper[17876]: I0313 10:48:58.109688 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-oauth-serving-cert\") pod \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\" (UID: \"5b4e6bca-b94c-405f-92ac-7e11d0e32acd\") " Mar 13 10:48:58.110702 master-0 kubenswrapper[17876]: I0313 10:48:58.109759 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:48:58.110702 master-0 kubenswrapper[17876]: I0313 10:48:58.110156 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-service-ca" (OuterVolumeSpecName: "service-ca") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:48:58.110702 master-0 kubenswrapper[17876]: I0313 10:48:58.110339 17876 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.110702 master-0 kubenswrapper[17876]: I0313 10:48:58.110355 17876 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.111013 master-0 kubenswrapper[17876]: I0313 10:48:58.110899 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:48:58.111280 master-0 kubenswrapper[17876]: I0313 10:48:58.109997 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-config" (OuterVolumeSpecName: "console-config") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:48:58.114329 master-0 kubenswrapper[17876]: I0313 10:48:58.114116 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:48:58.114329 master-0 kubenswrapper[17876]: I0313 10:48:58.114121 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-kube-api-access-gdt6b" (OuterVolumeSpecName: "kube-api-access-gdt6b") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "kube-api-access-gdt6b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:48:58.114329 master-0 kubenswrapper[17876]: I0313 10:48:58.114280 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "5b4e6bca-b94c-405f-92ac-7e11d0e32acd" (UID: "5b4e6bca-b94c-405f-92ac-7e11d0e32acd"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:48:58.212330 master-0 kubenswrapper[17876]: I0313 10:48:58.212259 17876 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.212330 master-0 kubenswrapper[17876]: I0313 10:48:58.212319 17876 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.212330 master-0 kubenswrapper[17876]: I0313 10:48:58.212340 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdt6b\" (UniqueName: \"kubernetes.io/projected/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-kube-api-access-gdt6b\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.212755 master-0 kubenswrapper[17876]: I0313 10:48:58.212354 17876 reconciler_common.go:293] "Volume 
detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.212755 master-0 kubenswrapper[17876]: I0313 10:48:58.212377 17876 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/5b4e6bca-b94c-405f-92ac-7e11d0e32acd-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:48:58.228483 master-0 kubenswrapper[17876]: I0313 10:48:58.228415 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 13 10:48:58.229450 master-0 kubenswrapper[17876]: W0313 10:48:58.229395 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod18c635c3_af41_4465_887c_e0675fabb3e8.slice/crio-b421f77276d2afbb7bebeb8a2ed1a90e991f125fbf72adf0463cc5c759314258 WatchSource:0}: Error finding container b421f77276d2afbb7bebeb8a2ed1a90e991f125fbf72adf0463cc5c759314258: Status 404 returned error can't find the container with id b421f77276d2afbb7bebeb8a2ed1a90e991f125fbf72adf0463cc5c759314258 Mar 13 10:48:58.688893 master-0 kubenswrapper[17876]: I0313 10:48:58.688784 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6d4c7bc587-hj6ht_5b4e6bca-b94c-405f-92ac-7e11d0e32acd/console/0.log" Mar 13 10:48:58.689834 master-0 kubenswrapper[17876]: I0313 10:48:58.689130 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6d4c7bc587-hj6ht" event={"ID":"5b4e6bca-b94c-405f-92ac-7e11d0e32acd","Type":"ContainerDied","Data":"8d3e12452911a220ce754d8d34e68d140deea1e78b1166cfbc0236dc7de97170"} Mar 13 10:48:58.689834 master-0 kubenswrapper[17876]: I0313 10:48:58.689204 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6d4c7bc587-hj6ht" Mar 13 10:48:58.689834 master-0 kubenswrapper[17876]: I0313 10:48:58.689232 17876 scope.go:117] "RemoveContainer" containerID="68fb6a2e5d642c510d9c2d0ee86d8713035602b3bd4f56d6ebd9e55a38f5ec0b" Mar 13 10:48:58.695422 master-0 kubenswrapper[17876]: I0313 10:48:58.695067 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"18c635c3-af41-4465-887c-e0675fabb3e8","Type":"ContainerStarted","Data":"fce27e33f22aa35e474593b031d8023e346af89a338727d0686c327a701ed472"} Mar 13 10:48:58.695422 master-0 kubenswrapper[17876]: I0313 10:48:58.695150 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"18c635c3-af41-4465-887c-e0675fabb3e8","Type":"ContainerStarted","Data":"b421f77276d2afbb7bebeb8a2ed1a90e991f125fbf72adf0463cc5c759314258"} Mar 13 10:48:58.721972 master-0 kubenswrapper[17876]: I0313 10:48:58.721822 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=1.7217987799999999 podStartE2EDuration="1.72179878s" podCreationTimestamp="2026-03-13 10:48:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:48:58.717463497 +0000 UTC m=+446.553269993" watchObservedRunningTime="2026-03-13 10:48:58.72179878 +0000 UTC m=+446.557605256" Mar 13 10:48:58.745181 master-0 kubenswrapper[17876]: I0313 10:48:58.738001 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6d4c7bc587-hj6ht"] Mar 13 10:48:58.748845 master-0 kubenswrapper[17876]: I0313 10:48:58.748785 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6d4c7bc587-hj6ht"] Mar 13 10:49:00.517790 master-0 kubenswrapper[17876]: I0313 10:49:00.508765 17876 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b4e6bca-b94c-405f-92ac-7e11d0e32acd" path="/var/lib/kubelet/pods/5b4e6bca-b94c-405f-92ac-7e11d0e32acd/volumes" Mar 13 10:49:01.733998 master-0 kubenswrapper[17876]: I0313 10:49:01.733891 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 13 10:49:01.734848 master-0 kubenswrapper[17876]: E0313 10:49:01.734250 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5b4e6bca-b94c-405f-92ac-7e11d0e32acd" containerName="console" Mar 13 10:49:01.734848 master-0 kubenswrapper[17876]: I0313 10:49:01.734263 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="5b4e6bca-b94c-405f-92ac-7e11d0e32acd" containerName="console" Mar 13 10:49:01.734848 master-0 kubenswrapper[17876]: I0313 10:49:01.734467 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="5b4e6bca-b94c-405f-92ac-7e11d0e32acd" containerName="console" Mar 13 10:49:01.735377 master-0 kubenswrapper[17876]: I0313 10:49:01.735289 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.737812 master-0 kubenswrapper[17876]: I0313 10:49:01.737768 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 13 10:49:01.738418 master-0 kubenswrapper[17876]: I0313 10:49:01.738390 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-hnsk9" Mar 13 10:49:01.745752 master-0 kubenswrapper[17876]: I0313 10:49:01.745638 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 13 10:49:01.772262 master-0 kubenswrapper[17876]: I0313 10:49:01.772172 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-var-lock\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.772262 master-0 kubenswrapper[17876]: I0313 10:49:01.772252 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/643dd13f-bd5e-432a-98dc-26ef29a54238-kube-api-access\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.772262 master-0 kubenswrapper[17876]: I0313 10:49:01.772274 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.879162 master-0 kubenswrapper[17876]: I0313 10:49:01.878831 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/643dd13f-bd5e-432a-98dc-26ef29a54238-kube-api-access\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.879162 master-0 kubenswrapper[17876]: I0313 10:49:01.878884 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.879162 master-0 kubenswrapper[17876]: I0313 10:49:01.879022 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-var-lock\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.879162 master-0 kubenswrapper[17876]: I0313 10:49:01.879119 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-var-lock\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.879162 master-0 kubenswrapper[17876]: I0313 10:49:01.879142 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:01.896819 master-0 kubenswrapper[17876]: I0313 10:49:01.896750 17876 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/643dd13f-bd5e-432a-98dc-26ef29a54238-kube-api-access\") pod \"installer-5-master-0\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") " pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:02.053004 master-0 kubenswrapper[17876]: I0313 10:49:02.052800 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:02.526641 master-0 kubenswrapper[17876]: I0313 10:49:02.526580 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-5-master-0"] Mar 13 10:49:02.531132 master-0 kubenswrapper[17876]: W0313 10:49:02.530050 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod643dd13f_bd5e_432a_98dc_26ef29a54238.slice/crio-a3181f4a1f201a781ad788e3441cbbbd9cedd09026968beaaaccacca84ab6777 WatchSource:0}: Error finding container a3181f4a1f201a781ad788e3441cbbbd9cedd09026968beaaaccacca84ab6777: Status 404 returned error can't find the container with id a3181f4a1f201a781ad788e3441cbbbd9cedd09026968beaaaccacca84ab6777 Mar 13 10:49:02.726509 master-0 kubenswrapper[17876]: I0313 10:49:02.726428 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"643dd13f-bd5e-432a-98dc-26ef29a54238","Type":"ContainerStarted","Data":"a3181f4a1f201a781ad788e3441cbbbd9cedd09026968beaaaccacca84ab6777"} Mar 13 10:49:04.746425 master-0 kubenswrapper[17876]: I0313 10:49:04.746312 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"643dd13f-bd5e-432a-98dc-26ef29a54238","Type":"ContainerStarted","Data":"01d105a3b174b856c1704b737743a28c32aa60f808973e3b6f3a7fdca8cc3ec0"} Mar 13 10:49:04.765005 master-0 kubenswrapper[17876]: I0313 10:49:04.764906 17876 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="openshift-kube-apiserver/installer-5-master-0" podStartSLOduration=3.764887188 podStartE2EDuration="3.764887188s" podCreationTimestamp="2026-03-13 10:49:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:49:04.764132966 +0000 UTC m=+452.599939442" watchObservedRunningTime="2026-03-13 10:49:04.764887188 +0000 UTC m=+452.600693664" Mar 13 10:49:05.053302 master-0 kubenswrapper[17876]: I0313 10:49:05.053077 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-5ccf58899f-qzj28" podUID="f8190215-7d9b-4508-a5cb-cee577d23254" containerName="console" containerID="cri-o://2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea" gracePeriod=15 Mar 13 10:49:05.557649 master-0 kubenswrapper[17876]: I0313 10:49:05.557570 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5ccf58899f-qzj28_f8190215-7d9b-4508-a5cb-cee577d23254/console/0.log" Mar 13 10:49:05.557649 master-0 kubenswrapper[17876]: I0313 10:49:05.557648 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:49:05.633346 master-0 kubenswrapper[17876]: I0313 10:49:05.633274 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8xtz\" (UniqueName: \"kubernetes.io/projected/f8190215-7d9b-4508-a5cb-cee577d23254-kube-api-access-r8xtz\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.633684 master-0 kubenswrapper[17876]: I0313 10:49:05.633382 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-service-ca\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.633684 master-0 kubenswrapper[17876]: I0313 10:49:05.633443 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-trusted-ca-bundle\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.633684 master-0 kubenswrapper[17876]: I0313 10:49:05.633485 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-oauth-serving-cert\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.633684 master-0 kubenswrapper[17876]: I0313 10:49:05.633679 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-oauth-config\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.634058 master-0 kubenswrapper[17876]: 
I0313 10:49:05.633719 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-console-config\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.634058 master-0 kubenswrapper[17876]: I0313 10:49:05.633768 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-serving-cert\") pod \"f8190215-7d9b-4508-a5cb-cee577d23254\" (UID: \"f8190215-7d9b-4508-a5cb-cee577d23254\") " Mar 13 10:49:05.634058 master-0 kubenswrapper[17876]: I0313 10:49:05.633898 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-service-ca" (OuterVolumeSpecName: "service-ca") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:05.634339 master-0 kubenswrapper[17876]: I0313 10:49:05.634311 17876 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:05.634428 master-0 kubenswrapper[17876]: I0313 10:49:05.634395 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:05.634921 master-0 kubenswrapper[17876]: I0313 10:49:05.634874 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:05.635038 master-0 kubenswrapper[17876]: I0313 10:49:05.635002 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-console-config" (OuterVolumeSpecName: "console-config") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:05.636409 master-0 kubenswrapper[17876]: I0313 10:49:05.636377 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:49:05.636788 master-0 kubenswrapper[17876]: I0313 10:49:05.636753 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f8190215-7d9b-4508-a5cb-cee577d23254-kube-api-access-r8xtz" (OuterVolumeSpecName: "kube-api-access-r8xtz") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "kube-api-access-r8xtz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:49:05.636934 master-0 kubenswrapper[17876]: I0313 10:49:05.636900 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f8190215-7d9b-4508-a5cb-cee577d23254" (UID: "f8190215-7d9b-4508-a5cb-cee577d23254"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:49:05.736324 master-0 kubenswrapper[17876]: I0313 10:49:05.736263 17876 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-console-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:05.736324 master-0 kubenswrapper[17876]: I0313 10:49:05.736316 17876 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:05.736324 master-0 kubenswrapper[17876]: I0313 10:49:05.736334 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8xtz\" (UniqueName: \"kubernetes.io/projected/f8190215-7d9b-4508-a5cb-cee577d23254-kube-api-access-r8xtz\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:05.736324 master-0 kubenswrapper[17876]: I0313 10:49:05.736346 17876 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:05.736324 master-0 kubenswrapper[17876]: I0313 10:49:05.736358 17876 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f8190215-7d9b-4508-a5cb-cee577d23254-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 
13 10:49:05.737026 master-0 kubenswrapper[17876]: I0313 10:49:05.736371 17876 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f8190215-7d9b-4508-a5cb-cee577d23254-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:05.756044 master-0 kubenswrapper[17876]: I0313 10:49:05.755977 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-5ccf58899f-qzj28_f8190215-7d9b-4508-a5cb-cee577d23254/console/0.log" Mar 13 10:49:05.756792 master-0 kubenswrapper[17876]: I0313 10:49:05.756054 17876 generic.go:334] "Generic (PLEG): container finished" podID="f8190215-7d9b-4508-a5cb-cee577d23254" containerID="2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea" exitCode=2 Mar 13 10:49:05.756792 master-0 kubenswrapper[17876]: I0313 10:49:05.756377 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ccf58899f-qzj28" event={"ID":"f8190215-7d9b-4508-a5cb-cee577d23254","Type":"ContainerDied","Data":"2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea"} Mar 13 10:49:05.756792 master-0 kubenswrapper[17876]: I0313 10:49:05.756427 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5ccf58899f-qzj28" event={"ID":"f8190215-7d9b-4508-a5cb-cee577d23254","Type":"ContainerDied","Data":"eef66ca75f37f7529919cca71023c61036beb37d3823b688f99f9ce2719e844e"} Mar 13 10:49:05.756792 master-0 kubenswrapper[17876]: I0313 10:49:05.756427 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-5ccf58899f-qzj28" Mar 13 10:49:05.756792 master-0 kubenswrapper[17876]: I0313 10:49:05.756454 17876 scope.go:117] "RemoveContainer" containerID="2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea" Mar 13 10:49:05.783550 master-0 kubenswrapper[17876]: I0313 10:49:05.783416 17876 scope.go:117] "RemoveContainer" containerID="2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea" Mar 13 10:49:05.784125 master-0 kubenswrapper[17876]: E0313 10:49:05.784028 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea\": container with ID starting with 2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea not found: ID does not exist" containerID="2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea" Mar 13 10:49:05.784214 master-0 kubenswrapper[17876]: I0313 10:49:05.784173 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea"} err="failed to get container status \"2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea\": rpc error: code = NotFound desc = could not find container \"2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea\": container with ID starting with 2e247d6d9dcc1e0b285a0aa9a4ecdf927393d65af6870a4c096caf404c6f86ea not found: ID does not exist" Mar 13 10:49:05.800359 master-0 kubenswrapper[17876]: I0313 10:49:05.800276 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-5ccf58899f-qzj28"] Mar 13 10:49:05.806897 master-0 kubenswrapper[17876]: I0313 10:49:05.806837 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-5ccf58899f-qzj28"] Mar 13 10:49:06.506419 master-0 kubenswrapper[17876]: I0313 10:49:06.506327 17876 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f8190215-7d9b-4508-a5cb-cee577d23254" path="/var/lib/kubelet/pods/f8190215-7d9b-4508-a5cb-cee577d23254/volumes" Mar 13 10:49:13.028386 master-0 kubenswrapper[17876]: I0313 10:49:13.028161 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-retry-1-master-0"] Mar 13 10:49:13.029758 master-0 kubenswrapper[17876]: E0313 10:49:13.028643 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f8190215-7d9b-4508-a5cb-cee577d23254" containerName="console" Mar 13 10:49:13.029758 master-0 kubenswrapper[17876]: I0313 10:49:13.028666 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="f8190215-7d9b-4508-a5cb-cee577d23254" containerName="console" Mar 13 10:49:13.029758 master-0 kubenswrapper[17876]: I0313 10:49:13.028958 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="f8190215-7d9b-4508-a5cb-cee577d23254" containerName="console" Mar 13 10:49:13.029758 master-0 kubenswrapper[17876]: I0313 10:49:13.029711 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.032692 master-0 kubenswrapper[17876]: I0313 10:49:13.032620 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 13 10:49:13.037632 master-0 kubenswrapper[17876]: I0313 10:49:13.034254 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-h29h2" Mar 13 10:49:13.046742 master-0 kubenswrapper[17876]: I0313 10:49:13.045889 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-retry-1-master-0"] Mar 13 10:49:13.144814 master-0 kubenswrapper[17876]: I0313 10:49:13.144691 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.145115 master-0 kubenswrapper[17876]: I0313 10:49:13.144978 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.145115 master-0 kubenswrapper[17876]: I0313 10:49:13.145043 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.246688 master-0 kubenswrapper[17876]: I0313 10:49:13.246640 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.246962 master-0 kubenswrapper[17876]: I0313 10:49:13.246948 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.249071 master-0 kubenswrapper[17876]: I0313 10:49:13.247064 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kubelet-dir\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.249360 master-0 kubenswrapper[17876]: I0313 10:49:13.249341 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.249533 master-0 kubenswrapper[17876]: I0313 10:49:13.249517 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-var-lock\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.265488 master-0 kubenswrapper[17876]: I0313 10:49:13.265441 17876 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kube-api-access\") pod \"installer-2-retry-1-master-0\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.359569 master-0 kubenswrapper[17876]: I0313 10:49:13.359438 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:13.826350 master-0 kubenswrapper[17876]: I0313 10:49:13.826233 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-retry-1-master-0"] Mar 13 10:49:13.835674 master-0 kubenswrapper[17876]: W0313 10:49:13.835600 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podcf18cf27_3f9c_4592_ad68_88ac6564cb0c.slice/crio-dece2bd0cffea0651f506325cd2a3a36d272bacb77ec90644bea61095c0ef3bd WatchSource:0}: Error finding container dece2bd0cffea0651f506325cd2a3a36d272bacb77ec90644bea61095c0ef3bd: Status 404 returned error can't find the container with id dece2bd0cffea0651f506325cd2a3a36d272bacb77ec90644bea61095c0ef3bd Mar 13 10:49:14.822837 master-0 kubenswrapper[17876]: I0313 10:49:14.822741 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-1-master-0" event={"ID":"cf18cf27-3f9c-4592-ad68-88ac6564cb0c","Type":"ContainerStarted","Data":"ad8ac8059240f52b2b199b60f4467f719410ccae3b4eebc43faf0a95523c1f04"} Mar 13 10:49:14.822837 master-0 kubenswrapper[17876]: I0313 10:49:14.822846 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-1-master-0" event={"ID":"cf18cf27-3f9c-4592-ad68-88ac6564cb0c","Type":"ContainerStarted","Data":"dece2bd0cffea0651f506325cd2a3a36d272bacb77ec90644bea61095c0ef3bd"} Mar 13 10:49:14.845868 master-0 kubenswrapper[17876]: I0313 10:49:14.845757 17876 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-retry-1-master-0" podStartSLOduration=1.845737512 podStartE2EDuration="1.845737512s" podCreationTimestamp="2026-03-13 10:49:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:49:14.843858279 +0000 UTC m=+462.679664805" watchObservedRunningTime="2026-03-13 10:49:14.845737512 +0000 UTC m=+462.681543988" Mar 13 10:49:15.754233 master-0 kubenswrapper[17876]: I0313 10:49:15.753827 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-bfb55f4b6-qf9q7" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" containerID="cri-o://9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af" gracePeriod=15 Mar 13 10:49:16.302701 master-0 kubenswrapper[17876]: I0313 10:49:16.302510 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bfb55f4b6-qf9q7_9bf54984-47df-48ea-861b-9d6546c0f82b/console/0.log" Mar 13 10:49:16.302701 master-0 kubenswrapper[17876]: I0313 10:49:16.302589 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfb55f4b6-qf9q7" Mar 13 10:49:16.409121 master-0 kubenswrapper[17876]: I0313 10:49:16.409018 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c56zq\" (UniqueName: \"kubernetes.io/projected/9bf54984-47df-48ea-861b-9d6546c0f82b-kube-api-access-c56zq\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.409121 master-0 kubenswrapper[17876]: I0313 10:49:16.409112 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-oauth-serving-cert\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.409422 master-0 kubenswrapper[17876]: I0313 10:49:16.409155 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-trusted-ca-bundle\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.409422 master-0 kubenswrapper[17876]: I0313 10:49:16.409277 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-console-config\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.409422 master-0 kubenswrapper[17876]: I0313 10:49:16.409406 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-oauth-config\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.409589 master-0 
kubenswrapper[17876]: I0313 10:49:16.409496 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-serving-cert\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.409589 master-0 kubenswrapper[17876]: I0313 10:49:16.409528 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-service-ca\") pod \"9bf54984-47df-48ea-861b-9d6546c0f82b\" (UID: \"9bf54984-47df-48ea-861b-9d6546c0f82b\") " Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.410745 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-service-ca" (OuterVolumeSpecName: "service-ca") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.411255 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.411906 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). 
InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.412526 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-console-config" (OuterVolumeSpecName: "console-config") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.412856 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bf54984-47df-48ea-861b-9d6546c0f82b-kube-api-access-c56zq" (OuterVolumeSpecName: "kube-api-access-c56zq") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). InnerVolumeSpecName "kube-api-access-c56zq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.414471 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:49:16.419215 master-0 kubenswrapper[17876]: I0313 10:49:16.414598 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "9bf54984-47df-48ea-861b-9d6546c0f82b" (UID: "9bf54984-47df-48ea-861b-9d6546c0f82b"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:49:16.512992 master-0 kubenswrapper[17876]: I0313 10:49:16.512906 17876 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-console-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.512992 master-0 kubenswrapper[17876]: I0313 10:49:16.512981 17876 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.512992 master-0 kubenswrapper[17876]: I0313 10:49:16.512998 17876 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/9bf54984-47df-48ea-861b-9d6546c0f82b-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.512992 master-0 kubenswrapper[17876]: I0313 10:49:16.513010 17876 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.513379 master-0 kubenswrapper[17876]: I0313 10:49:16.513023 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c56zq\" (UniqueName: \"kubernetes.io/projected/9bf54984-47df-48ea-861b-9d6546c0f82b-kube-api-access-c56zq\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.513379 master-0 kubenswrapper[17876]: I0313 10:49:16.513034 17876 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.513379 master-0 kubenswrapper[17876]: I0313 10:49:16.513048 17876 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/9bf54984-47df-48ea-861b-9d6546c0f82b-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:16.844857 master-0 kubenswrapper[17876]: I0313 10:49:16.844759 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-bfb55f4b6-qf9q7_9bf54984-47df-48ea-861b-9d6546c0f82b/console/0.log" Mar 13 10:49:16.845111 master-0 kubenswrapper[17876]: I0313 10:49:16.844894 17876 generic.go:334] "Generic (PLEG): container finished" podID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerID="9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af" exitCode=2 Mar 13 10:49:16.845111 master-0 kubenswrapper[17876]: I0313 10:49:16.844928 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfb55f4b6-qf9q7" event={"ID":"9bf54984-47df-48ea-861b-9d6546c0f82b","Type":"ContainerDied","Data":"9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af"} Mar 13 10:49:16.845111 master-0 kubenswrapper[17876]: I0313 10:49:16.844956 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-bfb55f4b6-qf9q7" event={"ID":"9bf54984-47df-48ea-861b-9d6546c0f82b","Type":"ContainerDied","Data":"2817077f8bd8850602e04e02af065be1388aff47b0e480ea6a5c96c2be065880"} Mar 13 10:49:16.845111 master-0 kubenswrapper[17876]: I0313 10:49:16.844973 17876 scope.go:117] "RemoveContainer" containerID="9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af" Mar 13 10:49:16.845238 master-0 kubenswrapper[17876]: I0313 10:49:16.845130 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-bfb55f4b6-qf9q7" Mar 13 10:49:16.879672 master-0 kubenswrapper[17876]: I0313 10:49:16.879593 17876 scope.go:117] "RemoveContainer" containerID="9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af" Mar 13 10:49:16.880153 master-0 kubenswrapper[17876]: E0313 10:49:16.880070 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af\": container with ID starting with 9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af not found: ID does not exist" containerID="9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af" Mar 13 10:49:16.880226 master-0 kubenswrapper[17876]: I0313 10:49:16.880171 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af"} err="failed to get container status \"9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af\": rpc error: code = NotFound desc = could not find container \"9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af\": container with ID starting with 9ad1324225ef79d91f9aa27368ce5d7de8ab57e3e942a59dbfc820ce1b7b91af not found: ID does not exist" Mar 13 10:49:16.884134 master-0 kubenswrapper[17876]: I0313 10:49:16.884042 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-bfb55f4b6-qf9q7"] Mar 13 10:49:16.897636 master-0 kubenswrapper[17876]: I0313 10:49:16.897558 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-bfb55f4b6-qf9q7"] Mar 13 10:49:18.504898 master-0 kubenswrapper[17876]: I0313 10:49:18.504810 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" path="/var/lib/kubelet/pods/9bf54984-47df-48ea-861b-9d6546c0f82b/volumes" Mar 13 
10:49:24.127488 master-0 kubenswrapper[17876]: I0313 10:49:24.127375 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 13 10:49:24.129086 master-0 kubenswrapper[17876]: E0313 10:49:24.128085 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" Mar 13 10:49:24.129086 master-0 kubenswrapper[17876]: I0313 10:49:24.128157 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" Mar 13 10:49:24.129086 master-0 kubenswrapper[17876]: I0313 10:49:24.128565 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bf54984-47df-48ea-861b-9d6546c0f82b" containerName="console" Mar 13 10:49:24.129682 master-0 kubenswrapper[17876]: I0313 10:49:24.129609 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.134207 master-0 kubenswrapper[17876]: I0313 10:49:24.134136 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wbwm2" Mar 13 10:49:24.134811 master-0 kubenswrapper[17876]: I0313 10:49:24.134749 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 10:49:24.147290 master-0 kubenswrapper[17876]: I0313 10:49:24.143243 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 13 10:49:24.304030 master-0 kubenswrapper[17876]: I0313 10:49:24.303913 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " 
pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.304403 master-0 kubenswrapper[17876]: I0313 10:49:24.304182 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-var-lock\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.304403 master-0 kubenswrapper[17876]: I0313 10:49:24.304315 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kube-api-access\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.406447 master-0 kubenswrapper[17876]: I0313 10:49:24.406261 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.406447 master-0 kubenswrapper[17876]: I0313 10:49:24.406354 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-var-lock\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.406447 master-0 kubenswrapper[17876]: I0313 10:49:24.406397 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kube-api-access\") pod 
\"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.406748 master-0 kubenswrapper[17876]: I0313 10:49:24.406708 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.406810 master-0 kubenswrapper[17876]: I0313 10:49:24.406753 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-var-lock\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.428273 master-0 kubenswrapper[17876]: I0313 10:49:24.428205 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kube-api-access\") pod \"installer-3-master-0\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.474849 master-0 kubenswrapper[17876]: I0313 10:49:24.474724 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:24.915025 master-0 kubenswrapper[17876]: I0313 10:49:24.914944 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 13 10:49:25.927390 master-0 kubenswrapper[17876]: I0313 10:49:25.927279 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f","Type":"ContainerStarted","Data":"b2fd05fc9a11fcab9e7bbac5f4059065fa0a4629a30ce979706fe605af5f593a"} Mar 13 10:49:25.927390 master-0 kubenswrapper[17876]: I0313 10:49:25.927346 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f","Type":"ContainerStarted","Data":"eda8d2a5dc35361bda6f85688598b6f99af96886aa879a478f990b99840a528c"} Mar 13 10:49:25.947923 master-0 kubenswrapper[17876]: I0313 10:49:25.947822 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=1.947800196 podStartE2EDuration="1.947800196s" podCreationTimestamp="2026-03-13 10:49:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:49:25.94690218 +0000 UTC m=+473.782708676" watchObservedRunningTime="2026-03-13 10:49:25.947800196 +0000 UTC m=+473.783606672" Mar 13 10:49:26.654304 master-0 kubenswrapper[17876]: I0313 10:49:26.654246 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl"] Mar 13 10:49:26.655576 master-0 kubenswrapper[17876]: I0313 10:49:26.655552 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.668905 master-0 kubenswrapper[17876]: I0313 10:49:26.668859 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl"] Mar 13 10:49:26.750489 master-0 kubenswrapper[17876]: I0313 10:49:26.750391 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.750708 master-0 kubenswrapper[17876]: I0313 10:49:26.750495 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dxf7w\" (UniqueName: \"kubernetes.io/projected/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-kube-api-access-dxf7w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.750708 master-0 kubenswrapper[17876]: I0313 10:49:26.750587 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.852712 master-0 kubenswrapper[17876]: I0313 10:49:26.852652 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-dxf7w\" (UniqueName: \"kubernetes.io/projected/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-kube-api-access-dxf7w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.853178 master-0 kubenswrapper[17876]: I0313 10:49:26.853118 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.853446 master-0 kubenswrapper[17876]: I0313 10:49:26.853416 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.853754 master-0 kubenswrapper[17876]: I0313 10:49:26.853719 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.854294 master-0 kubenswrapper[17876]: I0313 10:49:26.854261 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-bundle\") pod 
\"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.874154 master-0 kubenswrapper[17876]: I0313 10:49:26.874078 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dxf7w\" (UniqueName: \"kubernetes.io/projected/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-kube-api-access-dxf7w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:26.973174 master-0 kubenswrapper[17876]: I0313 10:49:26.972976 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:27.445490 master-0 kubenswrapper[17876]: I0313 10:49:27.445420 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl"] Mar 13 10:49:27.942841 master-0 kubenswrapper[17876]: I0313 10:49:27.942760 17876 generic.go:334] "Generic (PLEG): container finished" podID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerID="af4111edc3c4dea227d6abb36d04acf10733745162fffe00d6e5db6853bb8ce7" exitCode=0 Mar 13 10:49:27.942841 master-0 kubenswrapper[17876]: I0313 10:49:27.942838 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" event={"ID":"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6","Type":"ContainerDied","Data":"af4111edc3c4dea227d6abb36d04acf10733745162fffe00d6e5db6853bb8ce7"} Mar 13 10:49:27.943153 master-0 kubenswrapper[17876]: I0313 10:49:27.942883 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" event={"ID":"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6","Type":"ContainerStarted","Data":"6c8171c1a2cd1ea08e3d34d655f04ac47733e066b4402fe85e929eac0ebf471a"} Mar 13 10:49:29.914242 master-0 kubenswrapper[17876]: I0313 10:49:29.914086 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:49:29.915068 master-0 kubenswrapper[17876]: I0313 10:49:29.914586 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" containerID="cri-o://da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" gracePeriod=30 Mar 13 10:49:29.915068 master-0 kubenswrapper[17876]: I0313 10:49:29.914791 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" containerID="cri-o://68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" gracePeriod=30 Mar 13 10:49:29.915068 master-0 kubenswrapper[17876]: I0313 10:49:29.914869 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" containerID="cri-o://1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" gracePeriod=30 Mar 13 10:49:29.922521 master-0 kubenswrapper[17876]: I0313 10:49:29.922366 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:49:29.922860 master-0 kubenswrapper[17876]: E0313 10:49:29.922804 17876 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" Mar 13 10:49:29.922860 master-0 kubenswrapper[17876]: I0313 10:49:29.922841 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" Mar 13 10:49:29.922860 master-0 kubenswrapper[17876]: E0313 10:49:29.922858 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="wait-for-host-port" Mar 13 10:49:29.923079 master-0 kubenswrapper[17876]: I0313 10:49:29.922870 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="wait-for-host-port" Mar 13 10:49:29.923079 master-0 kubenswrapper[17876]: E0313 10:49:29.922889 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 13 10:49:29.923079 master-0 kubenswrapper[17876]: I0313 10:49:29.922900 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 13 10:49:29.923079 master-0 kubenswrapper[17876]: E0313 10:49:29.922919 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" Mar 13 10:49:29.923079 master-0 kubenswrapper[17876]: I0313 10:49:29.922931 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" Mar 13 10:49:29.923471 master-0 kubenswrapper[17876]: I0313 10:49:29.923163 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 13 10:49:29.923471 master-0 kubenswrapper[17876]: I0313 10:49:29.923193 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" 
containerName="wait-for-host-port" Mar 13 10:49:29.923471 master-0 kubenswrapper[17876]: I0313 10:49:29.923207 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" Mar 13 10:49:29.923471 master-0 kubenswrapper[17876]: I0313 10:49:29.923248 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" Mar 13 10:49:29.963634 master-0 kubenswrapper[17876]: I0313 10:49:29.963581 17876 generic.go:334] "Generic (PLEG): container finished" podID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerID="bd3e779ba930ff68951abc0db28c50027b8fba87ffaccf616f42160d70d39940" exitCode=0 Mar 13 10:49:29.963761 master-0 kubenswrapper[17876]: I0313 10:49:29.963647 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" event={"ID":"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6","Type":"ContainerDied","Data":"bd3e779ba930ff68951abc0db28c50027b8fba87ffaccf616f42160d70d39940"} Mar 13 10:49:29.996477 master-0 kubenswrapper[17876]: I0313 10:49:29.996318 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 13 10:49:30.006167 master-0 kubenswrapper[17876]: I0313 10:49:30.006115 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.006419 master-0 kubenswrapper[17876]: I0313 10:49:30.006201 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.108000 master-0 kubenswrapper[17876]: I0313 10:49:30.107941 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.108141 master-0 kubenswrapper[17876]: I0313 10:49:30.108086 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.108181 master-0 kubenswrapper[17876]: I0313 10:49:30.108149 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.108345 master-0 kubenswrapper[17876]: I0313 10:49:30.108289 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.156333 master-0 
kubenswrapper[17876]: I0313 10:49:30.156184 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log" Mar 13 10:49:30.157616 master-0 kubenswrapper[17876]: I0313 10:49:30.157566 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.161704 master-0 kubenswrapper[17876]: I0313 10:49:30.161632 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 13 10:49:30.209117 master-0 kubenswrapper[17876]: I0313 10:49:30.209062 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"1d3d45b6ce1b3764f9927e623a71adf8\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " Mar 13 10:49:30.209228 master-0 kubenswrapper[17876]: I0313 10:49:30.209159 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"1d3d45b6ce1b3764f9927e623a71adf8\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " Mar 13 10:49:30.209442 master-0 kubenswrapper[17876]: I0313 10:49:30.209361 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "1d3d45b6ce1b3764f9927e623a71adf8" (UID: "1d3d45b6ce1b3764f9927e623a71adf8"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:30.209604 master-0 kubenswrapper[17876]: I0313 10:49:30.209479 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "1d3d45b6ce1b3764f9927e623a71adf8" (UID: "1d3d45b6ce1b3764f9927e623a71adf8"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:30.210114 master-0 kubenswrapper[17876]: I0313 10:49:30.210037 17876 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:30.210231 master-0 kubenswrapper[17876]: I0313 10:49:30.210184 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:30.508521 master-0 kubenswrapper[17876]: I0313 10:49:30.508380 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3d45b6ce1b3764f9927e623a71adf8" path="/var/lib/kubelet/pods/1d3d45b6ce1b3764f9927e623a71adf8/volumes" Mar 13 10:49:30.976729 master-0 kubenswrapper[17876]: I0313 10:49:30.976669 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log" Mar 13 10:49:30.977968 master-0 kubenswrapper[17876]: I0313 10:49:30.977921 17876 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" exitCode=0 Mar 13 10:49:30.978017 master-0 kubenswrapper[17876]: I0313 10:49:30.977974 17876 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" 
containerID="1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" exitCode=2 Mar 13 10:49:30.978017 master-0 kubenswrapper[17876]: I0313 10:49:30.977983 17876 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" exitCode=0 Mar 13 10:49:30.978017 master-0 kubenswrapper[17876]: I0313 10:49:30.978001 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:30.978142 master-0 kubenswrapper[17876]: I0313 10:49:30.978062 17876 scope.go:117] "RemoveContainer" containerID="68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" Mar 13 10:49:30.981107 master-0 kubenswrapper[17876]: I0313 10:49:30.981034 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 13 10:49:30.985108 master-0 kubenswrapper[17876]: I0313 10:49:30.985021 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 13 10:49:30.985206 master-0 kubenswrapper[17876]: I0313 10:49:30.985151 17876 generic.go:334] "Generic (PLEG): container finished" podID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerID="76ebe45a6c776297bfb000a15b96f4c8c803a4e293f16e9f20ac73b100bdd1ba" exitCode=0 Mar 13 10:49:30.985284 master-0 kubenswrapper[17876]: I0313 10:49:30.985251 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" 
event={"ID":"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6","Type":"ContainerDied","Data":"76ebe45a6c776297bfb000a15b96f4c8c803a4e293f16e9f20ac73b100bdd1ba"} Mar 13 10:49:30.989031 master-0 kubenswrapper[17876]: I0313 10:49:30.988973 17876 generic.go:334] "Generic (PLEG): container finished" podID="18c635c3-af41-4465-887c-e0675fabb3e8" containerID="fce27e33f22aa35e474593b031d8023e346af89a338727d0686c327a701ed472" exitCode=0 Mar 13 10:49:30.989187 master-0 kubenswrapper[17876]: I0313 10:49:30.989041 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"18c635c3-af41-4465-887c-e0675fabb3e8","Type":"ContainerDied","Data":"fce27e33f22aa35e474593b031d8023e346af89a338727d0686c327a701ed472"} Mar 13 10:49:31.002417 master-0 kubenswrapper[17876]: I0313 10:49:31.002355 17876 scope.go:117] "RemoveContainer" containerID="1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" Mar 13 10:49:31.029754 master-0 kubenswrapper[17876]: I0313 10:49:31.029701 17876 scope.go:117] "RemoveContainer" containerID="da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" Mar 13 10:49:31.051520 master-0 kubenswrapper[17876]: I0313 10:49:31.051456 17876 scope.go:117] "RemoveContainer" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" Mar 13 10:49:31.075179 master-0 kubenswrapper[17876]: I0313 10:49:31.075124 17876 scope.go:117] "RemoveContainer" containerID="68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" Mar 13 10:49:31.075722 master-0 kubenswrapper[17876]: E0313 10:49:31.075677 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": container with ID starting with 68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7 not found: ID does not exist" 
containerID="68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" Mar 13 10:49:31.075805 master-0 kubenswrapper[17876]: I0313 10:49:31.075720 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7"} err="failed to get container status \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": rpc error: code = NotFound desc = could not find container \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": container with ID starting with 68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7 not found: ID does not exist" Mar 13 10:49:31.075805 master-0 kubenswrapper[17876]: I0313 10:49:31.075748 17876 scope.go:117] "RemoveContainer" containerID="1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" Mar 13 10:49:31.076276 master-0 kubenswrapper[17876]: E0313 10:49:31.076165 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": container with ID starting with 1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512 not found: ID does not exist" containerID="1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" Mar 13 10:49:31.076276 master-0 kubenswrapper[17876]: I0313 10:49:31.076217 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512"} err="failed to get container status \"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": rpc error: code = NotFound desc = could not find container \"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": container with ID starting with 1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512 not found: ID does not exist" Mar 13 10:49:31.076276 master-0 
kubenswrapper[17876]: I0313 10:49:31.076249 17876 scope.go:117] "RemoveContainer" containerID="da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" Mar 13 10:49:31.076584 master-0 kubenswrapper[17876]: E0313 10:49:31.076535 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": container with ID starting with da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d not found: ID does not exist" containerID="da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" Mar 13 10:49:31.076647 master-0 kubenswrapper[17876]: I0313 10:49:31.076583 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d"} err="failed to get container status \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": rpc error: code = NotFound desc = could not find container \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": container with ID starting with da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d not found: ID does not exist" Mar 13 10:49:31.076647 master-0 kubenswrapper[17876]: I0313 10:49:31.076612 17876 scope.go:117] "RemoveContainer" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" Mar 13 10:49:31.076886 master-0 kubenswrapper[17876]: E0313 10:49:31.076862 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": container with ID starting with b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709 not found: ID does not exist" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" Mar 13 10:49:31.076951 master-0 kubenswrapper[17876]: I0313 10:49:31.076887 
17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709"} err="failed to get container status \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": rpc error: code = NotFound desc = could not find container \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": container with ID starting with b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709 not found: ID does not exist" Mar 13 10:49:31.076951 master-0 kubenswrapper[17876]: I0313 10:49:31.076902 17876 scope.go:117] "RemoveContainer" containerID="68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" Mar 13 10:49:31.077240 master-0 kubenswrapper[17876]: I0313 10:49:31.077216 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7"} err="failed to get container status \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": rpc error: code = NotFound desc = could not find container \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": container with ID starting with 68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7 not found: ID does not exist" Mar 13 10:49:31.077240 master-0 kubenswrapper[17876]: I0313 10:49:31.077237 17876 scope.go:117] "RemoveContainer" containerID="1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" Mar 13 10:49:31.077482 master-0 kubenswrapper[17876]: I0313 10:49:31.077460 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512"} err="failed to get container status \"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": rpc error: code = NotFound desc = could not find container 
\"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": container with ID starting with 1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512 not found: ID does not exist" Mar 13 10:49:31.077482 master-0 kubenswrapper[17876]: I0313 10:49:31.077479 17876 scope.go:117] "RemoveContainer" containerID="da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" Mar 13 10:49:31.077840 master-0 kubenswrapper[17876]: I0313 10:49:31.077820 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d"} err="failed to get container status \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": rpc error: code = NotFound desc = could not find container \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": container with ID starting with da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d not found: ID does not exist" Mar 13 10:49:31.077840 master-0 kubenswrapper[17876]: I0313 10:49:31.077837 17876 scope.go:117] "RemoveContainer" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" Mar 13 10:49:31.078029 master-0 kubenswrapper[17876]: I0313 10:49:31.078010 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709"} err="failed to get container status \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": rpc error: code = NotFound desc = could not find container \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": container with ID starting with b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709 not found: ID does not exist" Mar 13 10:49:31.078029 master-0 kubenswrapper[17876]: I0313 10:49:31.078026 17876 scope.go:117] "RemoveContainer" containerID="68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7" Mar 13 
10:49:31.078392 master-0 kubenswrapper[17876]: I0313 10:49:31.078369 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7"} err="failed to get container status \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": rpc error: code = NotFound desc = could not find container \"68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7\": container with ID starting with 68af3b5162cd55c04a9a2d47105283b902b4a55bb92b6269678fa976c32ce0b7 not found: ID does not exist" Mar 13 10:49:31.078392 master-0 kubenswrapper[17876]: I0313 10:49:31.078386 17876 scope.go:117] "RemoveContainer" containerID="1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512" Mar 13 10:49:31.078615 master-0 kubenswrapper[17876]: I0313 10:49:31.078589 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512"} err="failed to get container status \"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": rpc error: code = NotFound desc = could not find container \"1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512\": container with ID starting with 1027c7d3d1dfe5deb10e19687f42044a54100b2932d9dbfab5b5e64c895c9512 not found: ID does not exist" Mar 13 10:49:31.078615 master-0 kubenswrapper[17876]: I0313 10:49:31.078612 17876 scope.go:117] "RemoveContainer" containerID="da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d" Mar 13 10:49:31.078956 master-0 kubenswrapper[17876]: I0313 10:49:31.078932 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d"} err="failed to get container status \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": rpc error: code = NotFound desc = could not find 
container \"da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d\": container with ID starting with da402f4a4d1da32e68a0dd96b0ad2a205a1062635378ba5fdd89074440dac83d not found: ID does not exist" Mar 13 10:49:31.078956 master-0 kubenswrapper[17876]: I0313 10:49:31.078952 17876 scope.go:117] "RemoveContainer" containerID="b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709" Mar 13 10:49:31.079229 master-0 kubenswrapper[17876]: I0313 10:49:31.079204 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709"} err="failed to get container status \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": rpc error: code = NotFound desc = could not find container \"b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709\": container with ID starting with b577fbd2c51e4cdd2f9d416a2fa40e636d9247eb410b98ba0adc3a33a4244709 not found: ID does not exist" Mar 13 10:49:32.357401 master-0 kubenswrapper[17876]: I0313 10:49:32.357359 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" Mar 13 10:49:32.443713 master-0 kubenswrapper[17876]: I0313 10:49:32.443652 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 10:49:32.448420 master-0 kubenswrapper[17876]: I0313 10:49:32.446143 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-util\") pod \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") "
Mar 13 10:49:32.448767 master-0 kubenswrapper[17876]: I0313 10:49:32.448677 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-bundle\") pod \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") "
Mar 13 10:49:32.448886 master-0 kubenswrapper[17876]: I0313 10:49:32.448836 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dxf7w\" (UniqueName: \"kubernetes.io/projected/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-kube-api-access-dxf7w\") pod \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\" (UID: \"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6\") "
Mar 13 10:49:32.449758 master-0 kubenswrapper[17876]: I0313 10:49:32.449686 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-bundle" (OuterVolumeSpecName: "bundle") pod "dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" (UID: "dd54f5ca-f4c2-4510-bde9-d84bce93d7d6"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:49:32.451432 master-0 kubenswrapper[17876]: I0313 10:49:32.451383 17876 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:32.453688 master-0 kubenswrapper[17876]: I0313 10:49:32.453578 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-kube-api-access-dxf7w" (OuterVolumeSpecName: "kube-api-access-dxf7w") pod "dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" (UID: "dd54f5ca-f4c2-4510-bde9-d84bce93d7d6"). InnerVolumeSpecName "kube-api-access-dxf7w". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:49:32.466131 master-0 kubenswrapper[17876]: I0313 10:49:32.466043 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-util" (OuterVolumeSpecName: "util") pod "dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" (UID: "dd54f5ca-f4c2-4510-bde9-d84bce93d7d6"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:49:32.552933 master-0 kubenswrapper[17876]: I0313 10:49:32.552888 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-var-lock\") pod \"18c635c3-af41-4465-887c-e0675fabb3e8\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") "
Mar 13 10:49:32.553151 master-0 kubenswrapper[17876]: I0313 10:49:32.552949 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-var-lock" (OuterVolumeSpecName: "var-lock") pod "18c635c3-af41-4465-887c-e0675fabb3e8" (UID: "18c635c3-af41-4465-887c-e0675fabb3e8"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:49:32.553151 master-0 kubenswrapper[17876]: I0313 10:49:32.552978 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-kubelet-dir\") pod \"18c635c3-af41-4465-887c-e0675fabb3e8\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") "
Mar 13 10:49:32.553151 master-0 kubenswrapper[17876]: I0313 10:49:32.553008 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "18c635c3-af41-4465-887c-e0675fabb3e8" (UID: "18c635c3-af41-4465-887c-e0675fabb3e8"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:49:32.553347 master-0 kubenswrapper[17876]: I0313 10:49:32.553229 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c635c3-af41-4465-887c-e0675fabb3e8-kube-api-access\") pod \"18c635c3-af41-4465-887c-e0675fabb3e8\" (UID: \"18c635c3-af41-4465-887c-e0675fabb3e8\") "
Mar 13 10:49:32.553743 master-0 kubenswrapper[17876]: I0313 10:49:32.553711 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dxf7w\" (UniqueName: \"kubernetes.io/projected/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-kube-api-access-dxf7w\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:32.553743 master-0 kubenswrapper[17876]: I0313 10:49:32.553735 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:32.553911 master-0 kubenswrapper[17876]: I0313 10:49:32.553748 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/18c635c3-af41-4465-887c-e0675fabb3e8-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:32.553911 master-0 kubenswrapper[17876]: I0313 10:49:32.553780 17876 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/dd54f5ca-f4c2-4510-bde9-d84bce93d7d6-util\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:32.558316 master-0 kubenswrapper[17876]: I0313 10:49:32.558275 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/18c635c3-af41-4465-887c-e0675fabb3e8-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "18c635c3-af41-4465-887c-e0675fabb3e8" (UID: "18c635c3-af41-4465-887c-e0675fabb3e8"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:49:32.655984 master-0 kubenswrapper[17876]: I0313 10:49:32.655713 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/18c635c3-af41-4465-887c-e0675fabb3e8-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:33.008401 master-0 kubenswrapper[17876]: I0313 10:49:33.007990 17876 scope.go:117] "RemoveContainer" containerID="ff58356cafd17211ab03ac0b3de2df04e88ec6642de92ac89ae8e6565eaf0c07"
Mar 13 10:49:33.021632 master-0 kubenswrapper[17876]: I0313 10:49:33.021511 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl" event={"ID":"dd54f5ca-f4c2-4510-bde9-d84bce93d7d6","Type":"ContainerDied","Data":"6c8171c1a2cd1ea08e3d34d655f04ac47733e066b4402fe85e929eac0ebf471a"}
Mar 13 10:49:33.021632 master-0 kubenswrapper[17876]: I0313 10:49:33.021632 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c8171c1a2cd1ea08e3d34d655f04ac47733e066b4402fe85e929eac0ebf471a"
Mar 13 10:49:33.022026 master-0 kubenswrapper[17876]: I0313 10:49:33.021574 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4p4vkl"
Mar 13 10:49:33.025611 master-0 kubenswrapper[17876]: I0313 10:49:33.025559 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"18c635c3-af41-4465-887c-e0675fabb3e8","Type":"ContainerDied","Data":"b421f77276d2afbb7bebeb8a2ed1a90e991f125fbf72adf0463cc5c759314258"}
Mar 13 10:49:33.025712 master-0 kubenswrapper[17876]: I0313 10:49:33.025624 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b421f77276d2afbb7bebeb8a2ed1a90e991f125fbf72adf0463cc5c759314258"
Mar 13 10:49:33.025712 master-0 kubenswrapper[17876]: I0313 10:49:33.025624 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0"
Mar 13 10:49:41.751841 master-0 kubenswrapper[17876]: I0313 10:49:41.751717 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: E0313 10:49:41.752671 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="extract"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: I0313 10:49:41.752733 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="extract"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: E0313 10:49:41.752759 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="pull"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: I0313 10:49:41.752766 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="pull"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: E0313 10:49:41.752777 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="util"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: I0313 10:49:41.752783 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="util"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: E0313 10:49:41.752818 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="18c635c3-af41-4465-887c-e0675fabb3e8" containerName="installer"
Mar 13 10:49:41.752834 master-0 kubenswrapper[17876]: I0313 10:49:41.752825 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="18c635c3-af41-4465-887c-e0675fabb3e8" containerName="installer"
Mar 13 10:49:41.753317 master-0 kubenswrapper[17876]: I0313 10:49:41.753037 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="18c635c3-af41-4465-887c-e0675fabb3e8" containerName="installer"
Mar 13 10:49:41.753317 master-0 kubenswrapper[17876]: I0313 10:49:41.753062 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd54f5ca-f4c2-4510-bde9-d84bce93d7d6" containerName="extract"
Mar 13 10:49:41.753745 master-0 kubenswrapper[17876]: I0313 10:49:41.753707 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:41.759279 master-0 kubenswrapper[17876]: I0313 10:49:41.759207 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:49:41.759407 master-0 kubenswrapper[17876]: I0313 10:49:41.759334 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 13 10:49:41.759772 master-0 kubenswrapper[17876]: I0313 10:49:41.759689 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver" containerID="cri-o://342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f" gracePeriod=15
Mar 13 10:49:41.760134 master-0 kubenswrapper[17876]: E0313 10:49:41.760108 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-syncer"
Mar 13 10:49:41.760134 master-0 kubenswrapper[17876]: I0313 10:49:41.760128 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-syncer"
Mar 13 10:49:41.760213 master-0 kubenswrapper[17876]: E0313 10:49:41.760149 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48512e02022680c9d90092634f0fc146" containerName="setup"
Mar 13 10:49:41.760213 master-0 kubenswrapper[17876]: I0313 10:49:41.760156 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="48512e02022680c9d90092634f0fc146" containerName="setup"
Mar 13 10:49:41.760213 master-0 kubenswrapper[17876]: E0313 10:49:41.760176 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver"
Mar 13 10:49:41.760213 master-0 kubenswrapper[17876]: I0313 10:49:41.760182 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver"
Mar 13 10:49:41.760213 master-0 kubenswrapper[17876]: E0313 10:49:41.760192 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 10:49:41.760213 master-0 kubenswrapper[17876]: I0313 10:49:41.760198 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: E0313 10:49:41.760220 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-insecure-readyz"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: I0313 10:49:41.760227 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-insecure-readyz"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: E0313 10:49:41.760235 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-check-endpoints"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: I0313 10:49:41.760242 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-check-endpoints"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: I0313 10:49:41.760370 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-check-endpoints"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: I0313 10:49:41.760384 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver"
Mar 13 10:49:41.760391 master-0 kubenswrapper[17876]: I0313 10:49:41.760392 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-insecure-readyz"
Mar 13 10:49:41.760594 master-0 kubenswrapper[17876]: I0313 10:49:41.760405 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-regeneration-controller"
Mar 13 10:49:41.760594 master-0 kubenswrapper[17876]: I0313 10:49:41.760419 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-syncer"
Mar 13 10:49:41.760759 master-0 kubenswrapper[17876]: I0313 10:49:41.760663 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-check-endpoints" containerID="cri-o://2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda" gracePeriod=15
Mar 13 10:49:41.760881 master-0 kubenswrapper[17876]: I0313 10:49:41.760772 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab" gracePeriod=15
Mar 13 10:49:41.763991 master-0 kubenswrapper[17876]: I0313 10:49:41.761004 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a" gracePeriod=15
Mar 13 10:49:41.763991 master-0 kubenswrapper[17876]: I0313 10:49:41.761042 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="48512e02022680c9d90092634f0fc146" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77" gracePeriod=15
Mar 13 10:49:41.928702 master-0 kubenswrapper[17876]: I0313 10:49:41.928656 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:41.928802 master-0 kubenswrapper[17876]: I0313 10:49:41.928713 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:41.928845 master-0 kubenswrapper[17876]: I0313 10:49:41.928817 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:41.928883 master-0 kubenswrapper[17876]: I0313 10:49:41.928861 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:41.929075 master-0 kubenswrapper[17876]: I0313 10:49:41.928912 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:41.930263 master-0 kubenswrapper[17876]: I0313 10:49:41.929081 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:41.930263 master-0 kubenswrapper[17876]: I0313 10:49:41.929119 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:41.930263 master-0 kubenswrapper[17876]: I0313 10:49:41.929144 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:41.949118 master-0 kubenswrapper[17876]: E0313 10:49:41.948867 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.030950 master-0 kubenswrapper[17876]: I0313 10:49:42.030790 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:42.030950 master-0 kubenswrapper[17876]: I0313 10:49:42.030869 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.031427 master-0 kubenswrapper[17876]: I0313 10:49:42.031385 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:42.031512 master-0 kubenswrapper[17876]: I0313 10:49:42.031471 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.031594 master-0 kubenswrapper[17876]: I0313 10:49:42.031525 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:42.031681 master-0 kubenswrapper[17876]: I0313 10:49:42.031632 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:42.031764 master-0 kubenswrapper[17876]: I0313 10:49:42.031730 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:42.031856 master-0 kubenswrapper[17876]: I0313 10:49:42.031829 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/36d4251d3504cdc0ec85144c1379056c-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"36d4251d3504cdc0ec85144c1379056c\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:42.031946 master-0 kubenswrapper[17876]: I0313 10:49:42.031923 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032026 master-0 kubenswrapper[17876]: I0313 10:49:42.032005 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032155 master-0 kubenswrapper[17876]: I0313 10:49:42.032090 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032219 master-0 kubenswrapper[17876]: I0313 10:49:42.032184 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032347 master-0 kubenswrapper[17876]: I0313 10:49:42.032308 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032435 master-0 kubenswrapper[17876]: I0313 10:49:42.032415 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032518 master-0 kubenswrapper[17876]: I0313 10:49:42.032439 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.032518 master-0 kubenswrapper[17876]: I0313 10:49:42.032417 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"a814bd60de133d95cf99630a978c017e\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.249570 master-0 kubenswrapper[17876]: I0313 10:49:42.249488 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:42.285567 master-0 kubenswrapper[17876]: E0313 10:49:42.285345 17876 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c60fa80176c3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:49:42.28424198 +0000 UTC m=+490.120048476,LastTimestamp:2026-03-13 10:49:42.28424198 +0000 UTC m=+490.120048476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:49:42.355609 master-0 kubenswrapper[17876]: I0313 10:49:42.355557 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"85385a5990fe3ed17db36d3d0ce358b59f6631bf352bca1e469d773099dc35cb"}
Mar 13 10:49:42.359059 master-0 kubenswrapper[17876]: I0313 10:49:42.358990 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_48512e02022680c9d90092634f0fc146/kube-apiserver-cert-syncer/0.log"
Mar 13 10:49:42.360537 master-0 kubenswrapper[17876]: I0313 10:49:42.360498 17876 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda" exitCode=0
Mar 13 10:49:42.360537 master-0 kubenswrapper[17876]: I0313 10:49:42.360535 17876 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a" exitCode=0
Mar 13 10:49:42.360696 master-0 kubenswrapper[17876]: I0313 10:49:42.360544 17876 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab" exitCode=0
Mar 13 10:49:42.360696 master-0 kubenswrapper[17876]: I0313 10:49:42.360553 17876 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77" exitCode=2
Mar 13 10:49:42.363730 master-0 kubenswrapper[17876]: I0313 10:49:42.363637 17876 generic.go:334] "Generic (PLEG): container finished" podID="643dd13f-bd5e-432a-98dc-26ef29a54238" containerID="01d105a3b174b856c1704b737743a28c32aa60f808973e3b6f3a7fdca8cc3ec0" exitCode=0
Mar 13 10:49:42.363730 master-0 kubenswrapper[17876]: I0313 10:49:42.363684 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"643dd13f-bd5e-432a-98dc-26ef29a54238","Type":"ContainerDied","Data":"01d105a3b174b856c1704b737743a28c32aa60f808973e3b6f3a7fdca8cc3ec0"}
Mar 13 10:49:42.365848 master-0 kubenswrapper[17876]: I0313 10:49:42.365781 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:42.366867 master-0 kubenswrapper[17876]: I0313 10:49:42.366795 17876 status_manager.go:851] "Failed to get status for pod" podUID="48512e02022680c9d90092634f0fc146" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:42.497941 master-0 kubenswrapper[17876]: I0313 10:49:42.497678 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:42.498184 master-0 kubenswrapper[17876]: I0313 10:49:42.498038 17876 status_manager.go:851] "Failed to get status for pod" podUID="48512e02022680c9d90092634f0fc146" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:43.389182 master-0 kubenswrapper[17876]: I0313 10:49:43.389055 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"a814bd60de133d95cf99630a978c017e","Type":"ContainerStarted","Data":"a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec"}
Mar 13 10:49:43.390943 master-0 kubenswrapper[17876]: E0313 10:49:43.390721 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 13 10:49:43.391638 master-0 kubenswrapper[17876]: I0313 10:49:43.391569 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:43.935360 master-0 kubenswrapper[17876]: I0313 10:49:43.933593 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0"
Mar 13 10:49:43.936069 master-0 kubenswrapper[17876]: I0313 10:49:43.936023 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:44.071038 master-0 kubenswrapper[17876]: I0313 10:49:44.070904 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/643dd13f-bd5e-432a-98dc-26ef29a54238-kube-api-access\") pod \"643dd13f-bd5e-432a-98dc-26ef29a54238\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") "
Mar 13 10:49:44.071328 master-0 kubenswrapper[17876]: I0313 10:49:44.071060 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-kubelet-dir\") pod \"643dd13f-bd5e-432a-98dc-26ef29a54238\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") "
Mar 13 10:49:44.071328 master-0 kubenswrapper[17876]: I0313 10:49:44.071132 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-var-lock\") pod \"643dd13f-bd5e-432a-98dc-26ef29a54238\" (UID: \"643dd13f-bd5e-432a-98dc-26ef29a54238\") "
Mar 13 10:49:44.071328 master-0 kubenswrapper[17876]: I0313 10:49:44.071221 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "643dd13f-bd5e-432a-98dc-26ef29a54238" (UID: "643dd13f-bd5e-432a-98dc-26ef29a54238"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:49:44.071328 master-0 kubenswrapper[17876]: I0313 10:49:44.071261 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-var-lock" (OuterVolumeSpecName: "var-lock") pod "643dd13f-bd5e-432a-98dc-26ef29a54238" (UID: "643dd13f-bd5e-432a-98dc-26ef29a54238"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 13 10:49:44.071497 master-0 kubenswrapper[17876]: I0313 10:49:44.071475 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:44.071497 master-0 kubenswrapper[17876]: I0313 10:49:44.071492 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/643dd13f-bd5e-432a-98dc-26ef29a54238-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:44.078064 master-0 kubenswrapper[17876]: I0313 10:49:44.077978 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/643dd13f-bd5e-432a-98dc-26ef29a54238-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "643dd13f-bd5e-432a-98dc-26ef29a54238" (UID: "643dd13f-bd5e-432a-98dc-26ef29a54238"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:49:44.173086 master-0 kubenswrapper[17876]: I0313 10:49:44.172946 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/643dd13f-bd5e-432a-98dc-26ef29a54238-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 13 10:49:44.230952 master-0 kubenswrapper[17876]: I0313 10:49:44.230893 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_48512e02022680c9d90092634f0fc146/kube-apiserver-cert-syncer/0.log"
Mar 13 10:49:44.232083 master-0 kubenswrapper[17876]: I0313 10:49:44.232049 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 13 10:49:44.233306 master-0 kubenswrapper[17876]: I0313 10:49:44.233212 17876 status_manager.go:851] "Failed to get status for pod" podUID="48512e02022680c9d90092634f0fc146" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:44.234205 master-0 kubenswrapper[17876]: I0313 10:49:44.234140 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 13 10:49:44.374469 master-0 kubenswrapper[17876]: I0313 10:49:44.374379 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"48512e02022680c9d90092634f0fc146\" (UID: \"48512e02022680c9d90092634f0fc146\") "
Mar 13 10:49:44.374795 master-0 kubenswrapper[17876]: I0313 10:49:44.374576 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"48512e02022680c9d90092634f0fc146\" (UID: \"48512e02022680c9d90092634f0fc146\") " Mar 13 10:49:44.374795 master-0 kubenswrapper[17876]: I0313 10:49:44.374554 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "48512e02022680c9d90092634f0fc146" (UID: "48512e02022680c9d90092634f0fc146"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:44.374795 master-0 kubenswrapper[17876]: I0313 10:49:44.374646 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "48512e02022680c9d90092634f0fc146" (UID: "48512e02022680c9d90092634f0fc146"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:44.374795 master-0 kubenswrapper[17876]: I0313 10:49:44.374671 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"48512e02022680c9d90092634f0fc146\" (UID: \"48512e02022680c9d90092634f0fc146\") " Mar 13 10:49:44.375068 master-0 kubenswrapper[17876]: I0313 10:49:44.374823 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "48512e02022680c9d90092634f0fc146" (UID: "48512e02022680c9d90092634f0fc146"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:44.375438 master-0 kubenswrapper[17876]: I0313 10:49:44.375381 17876 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:44.375438 master-0 kubenswrapper[17876]: I0313 10:49:44.375424 17876 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:44.375650 master-0 kubenswrapper[17876]: I0313 10:49:44.375445 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:44.409644 master-0 kubenswrapper[17876]: I0313 10:49:44.409557 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-5-master-0" event={"ID":"643dd13f-bd5e-432a-98dc-26ef29a54238","Type":"ContainerDied","Data":"a3181f4a1f201a781ad788e3441cbbbd9cedd09026968beaaaccacca84ab6777"} Mar 13 10:49:44.409644 master-0 kubenswrapper[17876]: I0313 10:49:44.409641 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a3181f4a1f201a781ad788e3441cbbbd9cedd09026968beaaaccacca84ab6777" Mar 13 10:49:44.410980 master-0 kubenswrapper[17876]: I0313 10:49:44.409697 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-5-master-0" Mar 13 10:49:44.415393 master-0 kubenswrapper[17876]: I0313 10:49:44.415314 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_48512e02022680c9d90092634f0fc146/kube-apiserver-cert-syncer/0.log" Mar 13 10:49:44.417007 master-0 kubenswrapper[17876]: I0313 10:49:44.416949 17876 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f" exitCode=0 Mar 13 10:49:44.417745 master-0 kubenswrapper[17876]: I0313 10:49:44.417694 17876 scope.go:117] "RemoveContainer" containerID="2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda" Mar 13 10:49:44.418146 master-0 kubenswrapper[17876]: I0313 10:49:44.417815 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:44.418495 master-0 kubenswrapper[17876]: E0313 10:49:44.418162 17876 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:49:44.452655 master-0 kubenswrapper[17876]: I0313 10:49:44.452570 17876 status_manager.go:851] "Failed to get status for pod" podUID="48512e02022680c9d90092634f0fc146" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:44.453533 master-0 kubenswrapper[17876]: I0313 10:49:44.453473 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" 
pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:44.454561 master-0 kubenswrapper[17876]: I0313 10:49:44.454505 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:44.455615 master-0 kubenswrapper[17876]: I0313 10:49:44.455546 17876 status_manager.go:851] "Failed to get status for pod" podUID="48512e02022680c9d90092634f0fc146" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:44.456652 master-0 kubenswrapper[17876]: I0313 10:49:44.456279 17876 scope.go:117] "RemoveContainer" containerID="0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a" Mar 13 10:49:44.488710 master-0 kubenswrapper[17876]: I0313 10:49:44.488666 17876 scope.go:117] "RemoveContainer" containerID="96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab" Mar 13 10:49:44.504189 master-0 kubenswrapper[17876]: I0313 10:49:44.504149 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="48512e02022680c9d90092634f0fc146" path="/var/lib/kubelet/pods/48512e02022680c9d90092634f0fc146/volumes" Mar 13 10:49:44.506066 master-0 kubenswrapper[17876]: I0313 10:49:44.506030 17876 scope.go:117] "RemoveContainer" containerID="ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77" Mar 13 10:49:44.523169 master-0 kubenswrapper[17876]: I0313 10:49:44.523143 17876 scope.go:117] 
"RemoveContainer" containerID="342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f" Mar 13 10:49:44.540465 master-0 kubenswrapper[17876]: I0313 10:49:44.540382 17876 scope.go:117] "RemoveContainer" containerID="ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc" Mar 13 10:49:44.556253 master-0 kubenswrapper[17876]: I0313 10:49:44.556142 17876 scope.go:117] "RemoveContainer" containerID="2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda" Mar 13 10:49:44.556901 master-0 kubenswrapper[17876]: E0313 10:49:44.556797 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda\": container with ID starting with 2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda not found: ID does not exist" containerID="2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda" Mar 13 10:49:44.556901 master-0 kubenswrapper[17876]: I0313 10:49:44.556829 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda"} err="failed to get container status \"2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda\": rpc error: code = NotFound desc = could not find container \"2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda\": container with ID starting with 2c088ed020c89cafbd541aeecd45afb557517995d35c634655e479be3e5adcda not found: ID does not exist" Mar 13 10:49:44.556901 master-0 kubenswrapper[17876]: I0313 10:49:44.556849 17876 scope.go:117] "RemoveContainer" containerID="0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a" Mar 13 10:49:44.557241 master-0 kubenswrapper[17876]: E0313 10:49:44.557199 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a\": container with ID starting with 0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a not found: ID does not exist" containerID="0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a" Mar 13 10:49:44.557321 master-0 kubenswrapper[17876]: I0313 10:49:44.557255 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a"} err="failed to get container status \"0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a\": rpc error: code = NotFound desc = could not find container \"0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a\": container with ID starting with 0fc5bd5756030ea5680c61725b16bb2e4d446971139ef9023d6f43cf1253af0a not found: ID does not exist" Mar 13 10:49:44.557321 master-0 kubenswrapper[17876]: I0313 10:49:44.557284 17876 scope.go:117] "RemoveContainer" containerID="96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab" Mar 13 10:49:44.557932 master-0 kubenswrapper[17876]: E0313 10:49:44.557838 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab\": container with ID starting with 96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab not found: ID does not exist" containerID="96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab" Mar 13 10:49:44.557932 master-0 kubenswrapper[17876]: I0313 10:49:44.557859 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab"} err="failed to get container status \"96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab\": rpc error: code = NotFound desc = could not find container 
\"96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab\": container with ID starting with 96ebfcf51bddec831d8a71d5c6a848e2f0a31417f95dc25888ccd40f27940cab not found: ID does not exist" Mar 13 10:49:44.557932 master-0 kubenswrapper[17876]: I0313 10:49:44.557874 17876 scope.go:117] "RemoveContainer" containerID="ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77" Mar 13 10:49:44.558302 master-0 kubenswrapper[17876]: E0313 10:49:44.558256 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77\": container with ID starting with ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77 not found: ID does not exist" containerID="ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77" Mar 13 10:49:44.558361 master-0 kubenswrapper[17876]: I0313 10:49:44.558332 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77"} err="failed to get container status \"ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77\": rpc error: code = NotFound desc = could not find container \"ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77\": container with ID starting with ecab62684688011289dd4addb951ba0c7d1d3b9dd7de40aa51da06e39fcf4a77 not found: ID does not exist" Mar 13 10:49:44.558402 master-0 kubenswrapper[17876]: I0313 10:49:44.558367 17876 scope.go:117] "RemoveContainer" containerID="342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f" Mar 13 10:49:44.558739 master-0 kubenswrapper[17876]: E0313 10:49:44.558707 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f\": container with ID starting with 
342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f not found: ID does not exist" containerID="342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f" Mar 13 10:49:44.558797 master-0 kubenswrapper[17876]: I0313 10:49:44.558749 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f"} err="failed to get container status \"342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f\": rpc error: code = NotFound desc = could not find container \"342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f\": container with ID starting with 342edc3ca608a512a3aa898625a8025e3cd801f8d48aae44ac91baa1ba769c2f not found: ID does not exist" Mar 13 10:49:44.558833 master-0 kubenswrapper[17876]: I0313 10:49:44.558784 17876 scope.go:117] "RemoveContainer" containerID="ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc" Mar 13 10:49:44.559129 master-0 kubenswrapper[17876]: E0313 10:49:44.559068 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc\": container with ID starting with ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc not found: ID does not exist" containerID="ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc" Mar 13 10:49:44.559229 master-0 kubenswrapper[17876]: I0313 10:49:44.559203 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc"} err="failed to get container status \"ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc\": rpc error: code = NotFound desc = could not find container \"ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc\": container with ID starting with 
ba66d011ba0d7ddb87307868a1376daba2f26674119cd8ab64aeb05248d9decc not found: ID does not exist" Mar 13 10:49:45.031116 master-0 kubenswrapper[17876]: E0313 10:49:45.030941 17876 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189c60fa80176c3c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:a814bd60de133d95cf99630a978c017e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:49:42.28424198 +0000 UTC m=+490.120048476,LastTimestamp:2026-03-13 10:49:42.28424198 +0000 UTC m=+490.120048476,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:49:45.428161 master-0 kubenswrapper[17876]: I0313 10:49:45.428092 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-retry-1-master-0_cf18cf27-3f9c-4592-ad68-88ac6564cb0c/installer/0.log" Mar 13 10:49:45.428636 master-0 kubenswrapper[17876]: I0313 10:49:45.428170 17876 generic.go:334] "Generic (PLEG): container finished" podID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" containerID="ad8ac8059240f52b2b199b60f4467f719410ccae3b4eebc43faf0a95523c1f04" exitCode=1 Mar 13 10:49:45.428636 master-0 kubenswrapper[17876]: I0313 10:49:45.428211 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/installer-2-retry-1-master-0" event={"ID":"cf18cf27-3f9c-4592-ad68-88ac6564cb0c","Type":"ContainerDied","Data":"ad8ac8059240f52b2b199b60f4467f719410ccae3b4eebc43faf0a95523c1f04"} Mar 13 10:49:45.430387 master-0 kubenswrapper[17876]: I0313 10:49:45.430341 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:45.430951 master-0 kubenswrapper[17876]: I0313 10:49:45.430879 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:45.493868 master-0 kubenswrapper[17876]: I0313 10:49:45.493815 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:45.496077 master-0 kubenswrapper[17876]: I0313 10:49:45.496012 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:45.500191 master-0 kubenswrapper[17876]: I0313 10:49:45.500122 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:45.510619 master-0 kubenswrapper[17876]: I0313 10:49:45.510551 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:45.511154 master-0 kubenswrapper[17876]: I0313 10:49:45.510637 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:45.511246 master-0 kubenswrapper[17876]: E0313 10:49:45.511209 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:45.512200 master-0 kubenswrapper[17876]: I0313 10:49:45.512160 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:45.597507 master-0 kubenswrapper[17876]: W0313 10:49:45.597444 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1453f6461bf5d599ad65a4656343ee91.slice/crio-894452daa6f5e71828f1d0f8559a731811f334a1798ad2bf8575c7889bdb0db5 WatchSource:0}: Error finding container 894452daa6f5e71828f1d0f8559a731811f334a1798ad2bf8575c7889bdb0db5: Status 404 returned error can't find the container with id 894452daa6f5e71828f1d0f8559a731811f334a1798ad2bf8575c7889bdb0db5 Mar 13 10:49:46.446927 master-0 kubenswrapper[17876]: I0313 10:49:46.446817 17876 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="9781a81acc7f50cb1bf44a9c6340a8a3532c1f813a7788da7274e76051848678" exitCode=0 Mar 13 10:49:46.446927 master-0 kubenswrapper[17876]: I0313 10:49:46.446901 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"9781a81acc7f50cb1bf44a9c6340a8a3532c1f813a7788da7274e76051848678"} Mar 13 10:49:46.447939 master-0 kubenswrapper[17876]: I0313 10:49:46.446981 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"894452daa6f5e71828f1d0f8559a731811f334a1798ad2bf8575c7889bdb0db5"} Mar 13 10:49:46.447939 master-0 kubenswrapper[17876]: I0313 10:49:46.447457 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:46.447939 master-0 kubenswrapper[17876]: I0313 10:49:46.447492 17876 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:46.448373 master-0 kubenswrapper[17876]: E0313 10:49:46.448302 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:46.448373 master-0 kubenswrapper[17876]: I0313 10:49:46.448325 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:46.449346 master-0 kubenswrapper[17876]: I0313 10:49:46.449264 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:46.933282 master-0 kubenswrapper[17876]: I0313 10:49:46.933244 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-retry-1-master-0_cf18cf27-3f9c-4592-ad68-88ac6564cb0c/installer/0.log" Mar 13 10:49:46.933490 master-0 kubenswrapper[17876]: I0313 10:49:46.933319 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:46.934114 master-0 kubenswrapper[17876]: I0313 10:49:46.934040 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:46.934704 master-0 kubenswrapper[17876]: I0313 10:49:46.934664 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:47.108299 master-0 kubenswrapper[17876]: I0313 10:49:47.108229 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kube-api-access\") pod \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " Mar 13 10:49:47.108299 master-0 kubenswrapper[17876]: I0313 10:49:47.108286 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-var-lock\") pod \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " Mar 13 10:49:47.108639 master-0 kubenswrapper[17876]: I0313 10:49:47.108455 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-var-lock" (OuterVolumeSpecName: "var-lock") pod "cf18cf27-3f9c-4592-ad68-88ac6564cb0c" (UID: "cf18cf27-3f9c-4592-ad68-88ac6564cb0c"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:47.108639 master-0 kubenswrapper[17876]: I0313 10:49:47.108593 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kubelet-dir\") pod \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\" (UID: \"cf18cf27-3f9c-4592-ad68-88ac6564cb0c\") " Mar 13 10:49:47.108798 master-0 kubenswrapper[17876]: I0313 10:49:47.108745 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cf18cf27-3f9c-4592-ad68-88ac6564cb0c" (UID: "cf18cf27-3f9c-4592-ad68-88ac6564cb0c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:47.108976 master-0 kubenswrapper[17876]: I0313 10:49:47.108937 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:47.108976 master-0 kubenswrapper[17876]: I0313 10:49:47.108957 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:47.111833 master-0 kubenswrapper[17876]: I0313 10:49:47.111784 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cf18cf27-3f9c-4592-ad68-88ac6564cb0c" (UID: "cf18cf27-3f9c-4592-ad68-88ac6564cb0c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:49:47.213627 master-0 kubenswrapper[17876]: I0313 10:49:47.213272 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cf18cf27-3f9c-4592-ad68-88ac6564cb0c-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:47.467606 master-0 kubenswrapper[17876]: I0313 10:49:47.467168 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"e2c88e4c1fc855558d16a23967e91f39d888b2b5d567204372568c3c9fe0b418"} Mar 13 10:49:47.467606 master-0 kubenswrapper[17876]: I0313 10:49:47.467232 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"355bf3d5ec99886176b83dda427870237f037b0cd1a25b7014ba793fd220e769"} Mar 13 10:49:47.467606 master-0 kubenswrapper[17876]: I0313 10:49:47.467247 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"9f53cc3cddb8fe9d1088b7766a1921dd54985febb851e44b5536925b781b058e"} Mar 13 10:49:47.467606 master-0 kubenswrapper[17876]: I0313 10:49:47.467497 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:47.467606 master-0 kubenswrapper[17876]: I0313 10:49:47.467513 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:47.467606 master-0 kubenswrapper[17876]: I0313 10:49:47.467553 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:47.468314 master-0 kubenswrapper[17876]: I0313 10:49:47.468060 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:47.471415 master-0 kubenswrapper[17876]: E0313 10:49:47.468118 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:47.471415 master-0 kubenswrapper[17876]: I0313 10:49:47.469216 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-2-retry-1-master-0_cf18cf27-3f9c-4592-ad68-88ac6564cb0c/installer/0.log" Mar 13 10:49:47.471415 master-0 kubenswrapper[17876]: I0313 10:49:47.469262 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-1-master-0" event={"ID":"cf18cf27-3f9c-4592-ad68-88ac6564cb0c","Type":"ContainerDied","Data":"dece2bd0cffea0651f506325cd2a3a36d272bacb77ec90644bea61095c0ef3bd"} Mar 13 10:49:47.471415 master-0 kubenswrapper[17876]: I0313 10:49:47.469296 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dece2bd0cffea0651f506325cd2a3a36d272bacb77ec90644bea61095c0ef3bd" Mar 13 10:49:47.471415 master-0 kubenswrapper[17876]: I0313 10:49:47.469360 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-1-master-0" Mar 13 10:49:47.471415 master-0 kubenswrapper[17876]: I0313 10:49:47.469963 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:47.485937 master-0 kubenswrapper[17876]: I0313 10:49:47.485885 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:47.486741 master-0 kubenswrapper[17876]: I0313 10:49:47.486707 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:48.478988 master-0 kubenswrapper[17876]: I0313 10:49:48.478874 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:48.478988 master-0 kubenswrapper[17876]: I0313 10:49:48.478921 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:49:48.480115 master-0 kubenswrapper[17876]: E0313 10:49:48.479909 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:49:51.312511 master-0 kubenswrapper[17876]: E0313 10:49:51.312420 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:51.313080 master-0 kubenswrapper[17876]: E0313 10:49:51.313011 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:51.313924 master-0 kubenswrapper[17876]: E0313 10:49:51.313870 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:51.314602 master-0 kubenswrapper[17876]: E0313 10:49:51.314553 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:51.315168 master-0 kubenswrapper[17876]: E0313 10:49:51.315121 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:51.315236 master-0 kubenswrapper[17876]: I0313 10:49:51.315179 17876 controller.go:115] "failed to update lease using latest lease, 
fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 10:49:51.315771 master-0 kubenswrapper[17876]: E0313 10:49:51.315738 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 13 10:49:51.518085 master-0 kubenswrapper[17876]: E0313 10:49:51.517964 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 13 10:49:51.919348 master-0 kubenswrapper[17876]: E0313 10:49:51.919294 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 13 10:49:52.508262 master-0 kubenswrapper[17876]: I0313 10:49:52.508186 17876 status_manager.go:851] "Failed to get status for pod" podUID="1453f6461bf5d599ad65a4656343ee91" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:52.509576 master-0 kubenswrapper[17876]: I0313 10:49:52.509504 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 
10:49:52.510598 master-0 kubenswrapper[17876]: I0313 10:49:52.510484 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:52.721155 master-0 kubenswrapper[17876]: E0313 10:49:52.721044 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 13 10:49:53.493786 master-0 kubenswrapper[17876]: I0313 10:49:53.493720 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:53.495833 master-0 kubenswrapper[17876]: I0313 10:49:53.495758 17876 status_manager.go:851] "Failed to get status for pod" podUID="1453f6461bf5d599ad65a4656343ee91" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:53.496649 master-0 kubenswrapper[17876]: I0313 10:49:53.496594 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:53.497398 master-0 kubenswrapper[17876]: I0313 10:49:53.497355 17876 status_manager.go:851] "Failed to get status for pod" 
podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:53.523697 master-0 kubenswrapper[17876]: I0313 10:49:53.523646 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:49:53.523697 master-0 kubenswrapper[17876]: I0313 10:49:53.523682 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:49:53.524523 master-0 kubenswrapper[17876]: E0313 10:49:53.524457 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:53.525378 master-0 kubenswrapper[17876]: I0313 10:49:53.525316 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:53.568315 master-0 kubenswrapper[17876]: W0313 10:49:53.568267 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36d4251d3504cdc0ec85144c1379056c.slice/crio-bb942b5755eb238a0f275f6842bf4bc57adc84cc8f7a4b9393b571e49ce72ce4 WatchSource:0}: Error finding container bb942b5755eb238a0f275f6842bf4bc57adc84cc8f7a4b9393b571e49ce72ce4: Status 404 returned error can't find the container with id bb942b5755eb238a0f275f6842bf4bc57adc84cc8f7a4b9393b571e49ce72ce4 Mar 13 10:49:54.323340 master-0 kubenswrapper[17876]: E0313 10:49:54.323233 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 13 10:49:54.534254 master-0 kubenswrapper[17876]: I0313 10:49:54.534200 17876 generic.go:334] "Generic (PLEG): container finished" podID="36d4251d3504cdc0ec85144c1379056c" containerID="85b5152b5f2d55c06544a218e0f1af3ce624bcee5405bd1e01a288ec9e2e2cf6" exitCode=0 Mar 13 10:49:54.534895 master-0 kubenswrapper[17876]: I0313 10:49:54.534258 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerDied","Data":"85b5152b5f2d55c06544a218e0f1af3ce624bcee5405bd1e01a288ec9e2e2cf6"} Mar 13 10:49:54.535043 master-0 kubenswrapper[17876]: I0313 10:49:54.535019 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"bb942b5755eb238a0f275f6842bf4bc57adc84cc8f7a4b9393b571e49ce72ce4"} Mar 13 10:49:54.535398 master-0 kubenswrapper[17876]: I0313 10:49:54.535358 17876 
kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:49:54.535469 master-0 kubenswrapper[17876]: I0313 10:49:54.535403 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:49:54.536192 master-0 kubenswrapper[17876]: I0313 10:49:54.536131 17876 status_manager.go:851] "Failed to get status for pod" podUID="1453f6461bf5d599ad65a4656343ee91" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-scheduler/pods/openshift-kube-scheduler-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:54.536311 master-0 kubenswrapper[17876]: E0313 10:49:54.536159 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:54.536850 master-0 kubenswrapper[17876]: I0313 10:49:54.536789 17876 status_manager.go:851] "Failed to get status for pod" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" pod="openshift-kube-apiserver/installer-5-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-5-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 13 10:49:54.537708 master-0 kubenswrapper[17876]: I0313 10:49:54.537456 17876 status_manager.go:851] "Failed to get status for pod" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" pod="openshift-etcd/installer-2-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-etcd/pods/installer-2-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection 
refused" Mar 13 10:49:55.545172 master-0 kubenswrapper[17876]: I0313 10:49:55.545112 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"7d795b2a4120951cab58f7dc86deb216ded65952d82db5b03d506bcb6832ee11"} Mar 13 10:49:55.545172 master-0 kubenswrapper[17876]: I0313 10:49:55.545172 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"dca891bbe7510f0478017374ef6fee957144d9f4d99956d189d47eea8dd8b22b"} Mar 13 10:49:55.545172 master-0 kubenswrapper[17876]: I0313 10:49:55.545186 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"e4889e63077351ef42a5c8064b74b8e1ac82458412f95c85ab6e1eb07948f2f1"} Mar 13 10:49:56.552976 master-0 kubenswrapper[17876]: I0313 10:49:56.552925 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_49ac3eb6-dc4c-4dbf-962f-2050bf32db6f/installer/0.log" Mar 13 10:49:56.553558 master-0 kubenswrapper[17876]: I0313 10:49:56.552984 17876 generic.go:334] "Generic (PLEG): container finished" podID="49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" containerID="b2fd05fc9a11fcab9e7bbac5f4059065fa0a4629a30ce979706fe605af5f593a" exitCode=1 Mar 13 10:49:56.553558 master-0 kubenswrapper[17876]: I0313 10:49:56.553040 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f","Type":"ContainerDied","Data":"b2fd05fc9a11fcab9e7bbac5f4059065fa0a4629a30ce979706fe605af5f593a"} Mar 13 10:49:56.557106 master-0 kubenswrapper[17876]: I0313 10:49:56.557067 17876 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"dc1b737ae610d9db8cc3e7354f475535c1e92828d8e61495c15665bb819c1139"} Mar 13 10:49:56.557195 master-0 kubenswrapper[17876]: I0313 10:49:56.557143 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"1933dea8b2c7bb36faefa3fab92f5ed7c24ebea8e1e3ab139c1d0e1092f9f843"} Mar 13 10:49:56.557478 master-0 kubenswrapper[17876]: I0313 10:49:56.557455 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:49:56.557478 master-0 kubenswrapper[17876]: I0313 10:49:56.557474 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:49:56.557715 master-0 kubenswrapper[17876]: I0313 10:49:56.557703 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:56.564390 master-0 kubenswrapper[17876]: I0313 10:49:56.564346 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/0.log" Mar 13 10:49:56.564483 master-0 kubenswrapper[17876]: I0313 10:49:56.564419 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9fc87edb050c91d1c07246e5eb5386e" containerID="d671010960870e0da2e9b058c8ed3d53e5393353d4ea9421bce18bd58bb8d5d1" exitCode=1 Mar 13 10:49:56.564533 master-0 kubenswrapper[17876]: I0313 10:49:56.564485 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerDied","Data":"d671010960870e0da2e9b058c8ed3d53e5393353d4ea9421bce18bd58bb8d5d1"} Mar 13 10:49:56.565325 master-0 kubenswrapper[17876]: I0313 10:49:56.565296 17876 scope.go:117] "RemoveContainer" containerID="d671010960870e0da2e9b058c8ed3d53e5393353d4ea9421bce18bd58bb8d5d1" Mar 13 10:49:57.577493 master-0 kubenswrapper[17876]: I0313 10:49:57.577438 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/0.log" Mar 13 10:49:57.578017 master-0 kubenswrapper[17876]: I0313 10:49:57.577751 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} Mar 13 10:49:58.526328 master-0 kubenswrapper[17876]: I0313 10:49:58.526281 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:58.526468 master-0 kubenswrapper[17876]: I0313 10:49:58.526350 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: I0313 10:49:58.531650 17876 patch_prober.go:28] interesting pod/kube-apiserver-master-0 container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]log ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]etcd ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: 
[+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/generic-apiserver-start-informers ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/priority-and-fairness-filter ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-apiextensions-informers ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-apiextensions-controllers ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/crd-informer-synced ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-system-namespaces-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 13 10:49:58.531688 master-0 
kubenswrapper[17876]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/scheduling/bootstrap-system-priority-classes ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/bootstrap-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/start-kube-aggregator-informers ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-registration-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-discovery-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]autoregister-completion ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-openapi-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 13 10:49:58.531688 master-0 kubenswrapper[17876]: livez check failed Mar 13 10:49:58.532823 master-0 kubenswrapper[17876]: I0313 10:49:58.531718 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="36d4251d3504cdc0ec85144c1379056c" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 13 
10:49:58.544247 master-0 kubenswrapper[17876]: I0313 10:49:58.544217 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_49ac3eb6-dc4c-4dbf-962f-2050bf32db6f/installer/0.log" Mar 13 10:49:58.544441 master-0 kubenswrapper[17876]: I0313 10:49:58.544293 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:58.573147 master-0 kubenswrapper[17876]: I0313 10:49:58.573084 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kubelet-dir\") pod \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " Mar 13 10:49:58.573398 master-0 kubenswrapper[17876]: I0313 10:49:58.573228 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-var-lock\") pod \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " Mar 13 10:49:58.573398 master-0 kubenswrapper[17876]: I0313 10:49:58.573233 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" (UID: "49ac3eb6-dc4c-4dbf-962f-2050bf32db6f"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:58.573398 master-0 kubenswrapper[17876]: I0313 10:49:58.573266 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kube-api-access\") pod \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\" (UID: \"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f\") " Mar 13 10:49:58.573398 master-0 kubenswrapper[17876]: I0313 10:49:58.573282 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-var-lock" (OuterVolumeSpecName: "var-lock") pod "49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" (UID: "49ac3eb6-dc4c-4dbf-962f-2050bf32db6f"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:49:58.573821 master-0 kubenswrapper[17876]: I0313 10:49:58.573588 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:58.573821 master-0 kubenswrapper[17876]: I0313 10:49:58.573600 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:49:58.578938 master-0 kubenswrapper[17876]: I0313 10:49:58.578900 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" (UID: "49ac3eb6-dc4c-4dbf-962f-2050bf32db6f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:49:58.588461 master-0 kubenswrapper[17876]: I0313 10:49:58.588422 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-3-master-0_49ac3eb6-dc4c-4dbf-962f-2050bf32db6f/installer/0.log" Mar 13 10:49:58.588720 master-0 kubenswrapper[17876]: I0313 10:49:58.588511 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"49ac3eb6-dc4c-4dbf-962f-2050bf32db6f","Type":"ContainerDied","Data":"eda8d2a5dc35361bda6f85688598b6f99af96886aa879a478f990b99840a528c"} Mar 13 10:49:58.588720 master-0 kubenswrapper[17876]: I0313 10:49:58.588547 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="eda8d2a5dc35361bda6f85688598b6f99af96886aa879a478f990b99840a528c" Mar 13 10:49:58.588720 master-0 kubenswrapper[17876]: I0313 10:49:58.588557 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 13 10:49:58.674145 master-0 kubenswrapper[17876]: I0313 10:49:58.674078 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/49ac3eb6-dc4c-4dbf-962f-2050bf32db6f-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:50:01.626317 master-0 kubenswrapper[17876]: I0313 10:50:01.626263 17876 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:50:01.738131 master-0 kubenswrapper[17876]: I0313 10:50:01.738057 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="512003e7-2e4d-4d19-8806-860f99bcb149" Mar 13 10:50:02.648196 master-0 kubenswrapper[17876]: I0313 10:50:02.648060 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:02.648196 master-0 kubenswrapper[17876]: I0313 10:50:02.648184 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:03.090488 master-0 kubenswrapper[17876]: I0313 10:50:03.090372 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:50:03.090801 master-0 kubenswrapper[17876]: I0313 10:50:03.090783 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:50:03.090901 master-0 kubenswrapper[17876]: I0313 10:50:03.090714 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager 
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 13 10:50:03.091023 master-0 kubenswrapper[17876]: I0313 10:50:03.090995 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 13 10:50:03.530959 master-0 kubenswrapper[17876]: I0313 10:50:03.530876 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:50:03.655584 master-0 kubenswrapper[17876]: I0313 10:50:03.655490 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:03.655584 master-0 kubenswrapper[17876]: I0313 10:50:03.655523 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:03.662257 master-0 kubenswrapper[17876]: I0313 10:50:03.662188 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:50:03.666801 master-0 kubenswrapper[17876]: I0313 10:50:03.666718 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="512003e7-2e4d-4d19-8806-860f99bcb149" Mar 13 10:50:04.664082 master-0 kubenswrapper[17876]: I0313 10:50:04.664003 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:04.664082 master-0 kubenswrapper[17876]: I0313 10:50:04.664059 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:12.447032 master-0 kubenswrapper[17876]: I0313 10:50:12.446933 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 13 10:50:12.545291 master-0 kubenswrapper[17876]: I0313 10:50:12.545218 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="36d4251d3504cdc0ec85144c1379056c" podUID="512003e7-2e4d-4d19-8806-860f99bcb149" Mar 13 10:50:13.084722 master-0 kubenswrapper[17876]: I0313 10:50:13.084650 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 13 10:50:13.091018 master-0 kubenswrapper[17876]: I0313 10:50:13.090968 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body= Mar 13 10:50:13.091173 master-0 kubenswrapper[17876]: I0313 10:50:13.091028 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" Mar 13 10:50:13.260657 master-0 kubenswrapper[17876]: I0313 10:50:13.260588 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 13 
10:50:13.297874 master-0 kubenswrapper[17876]: I0313 10:50:13.297797 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 13 10:50:13.396584 master-0 kubenswrapper[17876]: I0313 10:50:13.396510 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-d2pmx" Mar 13 10:50:13.603896 master-0 kubenswrapper[17876]: I0313 10:50:13.603802 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 13 10:50:13.659982 master-0 kubenswrapper[17876]: I0313 10:50:13.659765 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-qddwq" Mar 13 10:50:13.914303 master-0 kubenswrapper[17876]: I0313 10:50:13.914184 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 13 10:50:13.924745 master-0 kubenswrapper[17876]: I0313 10:50:13.924694 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 13 10:50:13.945192 master-0 kubenswrapper[17876]: I0313 10:50:13.945138 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-r9v82" Mar 13 10:50:13.970510 master-0 kubenswrapper[17876]: I0313 10:50:13.970448 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 10:50:14.503354 master-0 kubenswrapper[17876]: I0313 10:50:14.503290 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 13 10:50:14.527125 master-0 kubenswrapper[17876]: I0313 10:50:14.527056 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 13 
10:50:14.541247 master-0 kubenswrapper[17876]: I0313 10:50:14.541204 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 13 10:50:14.556976 master-0 kubenswrapper[17876]: I0313 10:50:14.556910 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 13 10:50:14.593342 master-0 kubenswrapper[17876]: I0313 10:50:14.593282 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 13 10:50:14.662223 master-0 kubenswrapper[17876]: I0313 10:50:14.662091 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 13 10:50:14.881275 master-0 kubenswrapper[17876]: I0313 10:50:14.881204 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 13 10:50:14.904621 master-0 kubenswrapper[17876]: I0313 10:50:14.904522 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 13 10:50:14.945632 master-0 kubenswrapper[17876]: I0313 10:50:14.945530 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 13 10:50:15.061516 master-0 kubenswrapper[17876]: I0313 10:50:15.061423 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 13 10:50:15.100723 master-0 kubenswrapper[17876]: I0313 10:50:15.100636 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 13 10:50:15.108509 master-0 kubenswrapper[17876]: I0313 10:50:15.108450 17876 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 13 10:50:15.243764 master-0 kubenswrapper[17876]: I0313 10:50:15.243632 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-5258b" Mar 13 10:50:15.269332 master-0 kubenswrapper[17876]: I0313 10:50:15.269277 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 13 10:50:15.320801 master-0 kubenswrapper[17876]: I0313 10:50:15.320741 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 13 10:50:15.325967 master-0 kubenswrapper[17876]: I0313 10:50:15.325901 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 10:50:15.335468 master-0 kubenswrapper[17876]: I0313 10:50:15.335412 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 13 10:50:15.591950 master-0 kubenswrapper[17876]: I0313 10:50:15.591806 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 13 10:50:15.608311 master-0 kubenswrapper[17876]: I0313 10:50:15.608234 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 13 10:50:15.668454 master-0 kubenswrapper[17876]: I0313 10:50:15.668372 17876 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 13 10:50:15.739442 master-0 kubenswrapper[17876]: I0313 10:50:15.739350 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 13 10:50:15.791044 master-0 kubenswrapper[17876]: I0313 10:50:15.790915 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:50:15.818302 master-0 kubenswrapper[17876]: I0313 10:50:15.818220 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 13 10:50:15.953820 master-0 kubenswrapper[17876]: I0313 10:50:15.953755 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 13 10:50:16.003862 master-0 kubenswrapper[17876]: I0313 10:50:16.003817 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 13 10:50:16.011472 master-0 kubenswrapper[17876]: I0313 10:50:16.011424 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 13 10:50:16.014060 master-0 kubenswrapper[17876]: I0313 10:50:16.014039 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 13 10:50:16.137393 master-0 kubenswrapper[17876]: I0313 10:50:16.137308 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 13 10:50:16.262275 master-0 kubenswrapper[17876]: I0313 10:50:16.262063 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 13 10:50:16.264149 master-0 kubenswrapper[17876]: I0313 10:50:16.264115 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 13 10:50:16.310258 master-0 kubenswrapper[17876]: I0313 10:50:16.310199 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-6g1360gbh170n" Mar 13 10:50:16.442697 master-0 kubenswrapper[17876]: I0313 10:50:16.442631 17876 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 13 10:50:16.449682 master-0 kubenswrapper[17876]: I0313 10:50:16.449614 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 13 10:50:16.450121 master-0 kubenswrapper[17876]: I0313 10:50:16.450068 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 13 10:50:16.451155 master-0 kubenswrapper[17876]: I0313 10:50:16.451086 17876 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 13 10:50:16.458755 master-0 kubenswrapper[17876]: I0313 10:50:16.458697 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 13 10:50:16.458830 master-0 kubenswrapper[17876]: I0313 10:50:16.458778 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0","openshift-storage/lvms-operator-97d47c4cb-g5jgs"] Mar 13 10:50:16.459153 master-0 kubenswrapper[17876]: E0313 10:50:16.459116 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" containerName="installer" Mar 13 10:50:16.459153 master-0 kubenswrapper[17876]: I0313 10:50:16.459147 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" containerName="installer" Mar 13 10:50:16.459239 master-0 kubenswrapper[17876]: E0313 10:50:16.459197 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" containerName="installer" Mar 13 10:50:16.459239 master-0 kubenswrapper[17876]: I0313 10:50:16.459204 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:16.459239 master-0 kubenswrapper[17876]: I0313 10:50:16.459229 17876 mirror_client.go:130] 
"Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="ce5d10c9-98a8-43dd-945e-e0aec4d44437" Mar 13 10:50:16.459363 master-0 kubenswrapper[17876]: I0313 10:50:16.459214 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" containerName="installer" Mar 13 10:50:16.459363 master-0 kubenswrapper[17876]: E0313 10:50:16.459277 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" containerName="installer" Mar 13 10:50:16.459363 master-0 kubenswrapper[17876]: I0313 10:50:16.459300 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" containerName="installer" Mar 13 10:50:16.459561 master-0 kubenswrapper[17876]: I0313 10:50:16.459526 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="643dd13f-bd5e-432a-98dc-26ef29a54238" containerName="installer" Mar 13 10:50:16.459622 master-0 kubenswrapper[17876]: I0313 10:50:16.459576 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="49ac3eb6-dc4c-4dbf-962f-2050bf32db6f" containerName="installer" Mar 13 10:50:16.459622 master-0 kubenswrapper[17876]: I0313 10:50:16.459595 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf18cf27-3f9c-4592-ad68-88ac6564cb0c" containerName="installer" Mar 13 10:50:16.460300 master-0 kubenswrapper[17876]: I0313 10:50:16.460275 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.463402 master-0 kubenswrapper[17876]: I0313 10:50:16.463345 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 13 10:50:16.463504 master-0 kubenswrapper[17876]: I0313 10:50:16.463403 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 13 10:50:16.463504 master-0 kubenswrapper[17876]: I0313 10:50:16.463402 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 13 10:50:16.463671 master-0 kubenswrapper[17876]: I0313 10:50:16.463358 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 13 10:50:16.463792 master-0 kubenswrapper[17876]: I0313 10:50:16.463771 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 13 10:50:16.465045 master-0 kubenswrapper[17876]: I0313 10:50:16.465009 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 13 10:50:16.485174 master-0 kubenswrapper[17876]: I0313 10:50:16.485045 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 13 10:50:16.488953 master-0 kubenswrapper[17876]: I0313 10:50:16.488866 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=15.488839272 podStartE2EDuration="15.488839272s" podCreationTimestamp="2026-03-13 10:50:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:50:16.485000882 +0000 UTC m=+524.320807378" 
watchObservedRunningTime="2026-03-13 10:50:16.488839272 +0000 UTC m=+524.324645758" Mar 13 10:50:16.500593 master-0 kubenswrapper[17876]: I0313 10:50:16.500541 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79jbz\" (UniqueName: \"kubernetes.io/projected/2de66bef-f924-4f8e-82d5-01d587573f51-kube-api-access-79jbz\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.500712 master-0 kubenswrapper[17876]: I0313 10:50:16.500607 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/2de66bef-f924-4f8e-82d5-01d587573f51-socket-dir\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.500712 master-0 kubenswrapper[17876]: I0313 10:50:16.500677 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-webhook-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.500897 master-0 kubenswrapper[17876]: I0313 10:50:16.500824 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-apiservice-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.500897 master-0 kubenswrapper[17876]: I0313 10:50:16.500866 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-metrics-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.563915 master-0 kubenswrapper[17876]: I0313 10:50:16.563667 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 13 10:50:16.601158 master-0 kubenswrapper[17876]: I0313 10:50:16.600764 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:50:16.602603 master-0 kubenswrapper[17876]: I0313 10:50:16.602558 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79jbz\" (UniqueName: \"kubernetes.io/projected/2de66bef-f924-4f8e-82d5-01d587573f51-kube-api-access-79jbz\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.602681 master-0 kubenswrapper[17876]: I0313 10:50:16.602628 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/2de66bef-f924-4f8e-82d5-01d587573f51-socket-dir\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.602723 master-0 kubenswrapper[17876]: I0313 10:50:16.602681 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-webhook-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.603162 master-0 kubenswrapper[17876]: I0313 10:50:16.603035 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-apiservice-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.603162 master-0 kubenswrapper[17876]: I0313 10:50:16.603157 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-metrics-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.603476 master-0 kubenswrapper[17876]: I0313 10:50:16.603420 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/2de66bef-f924-4f8e-82d5-01d587573f51-socket-dir\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.607414 master-0 kubenswrapper[17876]: I0313 10:50:16.607362 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-apiservice-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.608388 master-0 kubenswrapper[17876]: I0313 10:50:16.608339 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-metrics-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.608616 master-0 kubenswrapper[17876]: I0313 10:50:16.608586 17876 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/2de66bef-f924-4f8e-82d5-01d587573f51-webhook-cert\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.622588 master-0 kubenswrapper[17876]: I0313 10:50:16.622527 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79jbz\" (UniqueName: \"kubernetes.io/projected/2de66bef-f924-4f8e-82d5-01d587573f51-kube-api-access-79jbz\") pod \"lvms-operator-97d47c4cb-g5jgs\" (UID: \"2de66bef-f924-4f8e-82d5-01d587573f51\") " pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.667763 master-0 kubenswrapper[17876]: I0313 10:50:16.667687 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 13 10:50:16.683596 master-0 kubenswrapper[17876]: I0313 10:50:16.683549 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 13 10:50:16.684141 master-0 kubenswrapper[17876]: I0313 10:50:16.684023 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 13 10:50:16.692711 master-0 kubenswrapper[17876]: I0313 10:50:16.692660 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 13 10:50:16.775165 master-0 kubenswrapper[17876]: I0313 10:50:16.774599 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:16.810741 master-0 kubenswrapper[17876]: I0313 10:50:16.810700 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 13 10:50:16.891118 master-0 kubenswrapper[17876]: I0313 10:50:16.891049 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 13 10:50:16.896602 master-0 kubenswrapper[17876]: I0313 10:50:16.896534 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 10:50:16.978760 master-0 kubenswrapper[17876]: I0313 10:50:16.978663 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 13 10:50:17.016127 master-0 kubenswrapper[17876]: I0313 10:50:17.015715 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 13 10:50:17.050849 master-0 kubenswrapper[17876]: I0313 10:50:17.050794 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-9r4nm" Mar 13 10:50:17.111090 master-0 kubenswrapper[17876]: I0313 10:50:17.111047 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 10:50:17.127829 master-0 kubenswrapper[17876]: I0313 10:50:17.127753 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 13 10:50:17.258327 master-0 kubenswrapper[17876]: I0313 10:50:17.258201 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 13 10:50:17.333088 master-0 kubenswrapper[17876]: I0313 10:50:17.333046 17876 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-insights"/"operator-dockercfg-h7hlp" Mar 13 10:50:17.339132 master-0 kubenswrapper[17876]: I0313 10:50:17.339091 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 13 10:50:17.469523 master-0 kubenswrapper[17876]: I0313 10:50:17.469452 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 13 10:50:17.617809 master-0 kubenswrapper[17876]: I0313 10:50:17.617685 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 13 10:50:17.617809 master-0 kubenswrapper[17876]: I0313 10:50:17.617750 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 13 10:50:17.617809 master-0 kubenswrapper[17876]: I0313 10:50:17.617754 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 13 10:50:17.637224 master-0 kubenswrapper[17876]: I0313 10:50:17.637178 17876 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 13 10:50:17.668214 master-0 kubenswrapper[17876]: I0313 10:50:17.668148 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 13 10:50:17.686299 master-0 kubenswrapper[17876]: I0313 10:50:17.686261 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 13 10:50:17.743171 master-0 kubenswrapper[17876]: I0313 10:50:17.743034 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 13 10:50:17.805468 master-0 kubenswrapper[17876]: I0313 10:50:17.805389 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 13 10:50:17.868774 master-0 kubenswrapper[17876]: I0313 10:50:17.868597 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-dr74kngie93ut" Mar 13 10:50:17.988456 master-0 kubenswrapper[17876]: I0313 10:50:17.988405 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 13 10:50:17.994193 master-0 kubenswrapper[17876]: I0313 10:50:17.994159 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 13 10:50:18.009485 master-0 kubenswrapper[17876]: I0313 10:50:18.009414 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 13 10:50:18.087073 master-0 kubenswrapper[17876]: I0313 10:50:18.086997 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 13 10:50:18.103944 master-0 kubenswrapper[17876]: I0313 10:50:18.103879 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 13 10:50:18.179353 master-0 kubenswrapper[17876]: I0313 10:50:18.179267 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 13 10:50:18.190985 master-0 kubenswrapper[17876]: I0313 10:50:18.190911 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 13 10:50:18.260273 master-0 kubenswrapper[17876]: I0313 10:50:18.260201 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-hggr7" Mar 13 10:50:18.333394 master-0 kubenswrapper[17876]: I0313 10:50:18.333295 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 13 10:50:18.392368 master-0 kubenswrapper[17876]: I0313 10:50:18.392309 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 13 10:50:18.442383 master-0 kubenswrapper[17876]: I0313 10:50:18.442244 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 13 10:50:18.519750 master-0 kubenswrapper[17876]: I0313 10:50:18.519698 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 13 10:50:18.607192 master-0 kubenswrapper[17876]: I0313 10:50:18.606944 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 13 10:50:18.607919 master-0 kubenswrapper[17876]: I0313 10:50:18.607871 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 13 10:50:18.625877 master-0 kubenswrapper[17876]: I0313 10:50:18.625817 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 13 10:50:18.632218 master-0 kubenswrapper[17876]: I0313 10:50:18.632142 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 13 10:50:18.704764 master-0 kubenswrapper[17876]: I0313 10:50:18.704632 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 13 10:50:18.770163 master-0 kubenswrapper[17876]: I0313 10:50:18.770088 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 13 10:50:18.790489 master-0 kubenswrapper[17876]: I0313 10:50:18.790420 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 13 10:50:18.802287 master-0 kubenswrapper[17876]: I0313 10:50:18.802217 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 13 10:50:18.847222 master-0 kubenswrapper[17876]: I0313 10:50:18.844659 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 13 10:50:18.865202 master-0 kubenswrapper[17876]: I0313 10:50:18.865135 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 13 10:50:18.909828 master-0 kubenswrapper[17876]: I0313 10:50:18.909753 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 13 10:50:18.921636 master-0 kubenswrapper[17876]: I0313 10:50:18.921591 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 13 10:50:18.936410 master-0 kubenswrapper[17876]: I0313 10:50:18.936378 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 13 10:50:18.971272 master-0 kubenswrapper[17876]: I0313 10:50:18.971155 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 13 10:50:18.973649 master-0 kubenswrapper[17876]: I0313 10:50:18.973605 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 13 10:50:18.993071 master-0 kubenswrapper[17876]: I0313 10:50:18.993029 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 13 10:50:19.005963 master-0 kubenswrapper[17876]: I0313 10:50:19.005896 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 13 10:50:19.015789 master-0 kubenswrapper[17876]: I0313 10:50:19.015736 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 13 10:50:19.072806 master-0 kubenswrapper[17876]: I0313 10:50:19.072751 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt"
Mar 13 10:50:19.100022 master-0 kubenswrapper[17876]: I0313 10:50:19.098503 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 13 10:50:19.155483 master-0 kubenswrapper[17876]: I0313 10:50:19.155350 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 13 10:50:19.162302 master-0 kubenswrapper[17876]: I0313 10:50:19.162251 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 13 10:50:19.169018 master-0 kubenswrapper[17876]: I0313 10:50:19.168969 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 13 10:50:19.203385 master-0 kubenswrapper[17876]: I0313 10:50:19.203324 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-ff7d6"
Mar 13 10:50:19.228159 master-0 kubenswrapper[17876]: I0313 10:50:19.228015 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38"
Mar 13 10:50:19.233192 master-0 kubenswrapper[17876]: I0313 10:50:19.233154 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 13 10:50:19.256084 master-0 kubenswrapper[17876]: I0313 10:50:19.256038 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 13 10:50:19.296684 master-0 kubenswrapper[17876]: I0313 10:50:19.296640 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 13 10:50:19.332742 master-0 kubenswrapper[17876]: I0313 10:50:19.332666 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 13 10:50:19.378903 master-0 kubenswrapper[17876]: I0313 10:50:19.378820 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls"
Mar 13 10:50:19.451211 master-0 kubenswrapper[17876]: I0313 10:50:19.451145 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 13 10:50:19.521227 master-0 kubenswrapper[17876]: I0313 10:50:19.521085 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls"
Mar 13 10:50:19.568649 master-0 kubenswrapper[17876]: I0313 10:50:19.568586 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 13 10:50:19.579010 master-0 kubenswrapper[17876]: I0313 10:50:19.578957 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 13 10:50:19.579686 master-0 kubenswrapper[17876]: I0313 10:50:19.579647 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 13 10:50:19.588012 master-0 kubenswrapper[17876]: I0313 10:50:19.587974 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls"
Mar 13 10:50:19.613809 master-0 kubenswrapper[17876]: I0313 10:50:19.613753 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-vggdd"
Mar 13 10:50:19.628291 master-0 kubenswrapper[17876]: I0313 10:50:19.628226 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 13 10:50:19.641583 master-0 kubenswrapper[17876]: I0313 10:50:19.641539 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 13 10:50:19.718111 master-0 kubenswrapper[17876]: I0313 10:50:19.718029 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 13 10:50:19.779849 master-0 kubenswrapper[17876]: I0313 10:50:19.779712 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 13 10:50:19.807907 master-0 kubenswrapper[17876]: I0313 10:50:19.807848 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles"
Mar 13 10:50:19.859982 master-0 kubenswrapper[17876]: I0313 10:50:19.859909 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 13 10:50:19.894995 master-0 kubenswrapper[17876]: I0313 10:50:19.894919 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 13 10:50:19.930074 master-0 kubenswrapper[17876]: I0313 10:50:19.930013 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 13 10:50:20.020415 master-0 kubenswrapper[17876]: I0313 10:50:20.020339 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 13 10:50:20.179168 master-0 kubenswrapper[17876]: I0313 10:50:20.179055 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 13 10:50:20.179535 master-0 kubenswrapper[17876]: I0313 10:50:20.179398 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle"
Mar 13 10:50:20.206008 master-0 kubenswrapper[17876]: I0313 10:50:20.205269 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert"
Mar 13 10:50:20.233895 master-0 kubenswrapper[17876]: I0313 10:50:20.233848 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-c9gmn"
Mar 13 10:50:20.250140 master-0 kubenswrapper[17876]: I0313 10:50:20.250040 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy"
Mar 13 10:50:20.340439 master-0 kubenswrapper[17876]: I0313 10:50:20.340356 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-2p9p4"
Mar 13 10:50:20.349131 master-0 kubenswrapper[17876]: I0313 10:50:20.349073 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 13 10:50:20.481548 master-0 kubenswrapper[17876]: I0313 10:50:20.481351 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 13 10:50:20.546824 master-0 kubenswrapper[17876]: I0313 10:50:20.546731 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 13 10:50:20.606993 master-0 kubenswrapper[17876]: I0313 10:50:20.606920 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-dvqsb"
Mar 13 10:50:20.617189 master-0 kubenswrapper[17876]: I0313 10:50:20.617062 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 13 10:50:20.634118 master-0 kubenswrapper[17876]: I0313 10:50:20.634013 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session"
Mar 13 10:50:20.674255 master-0 kubenswrapper[17876]: I0313 10:50:20.674167 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca"
Mar 13 10:50:20.711557 master-0 kubenswrapper[17876]: I0313 10:50:20.711472 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 13 10:50:20.761971 master-0 kubenswrapper[17876]: I0313 10:50:20.761699 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 13 10:50:20.783722 master-0 kubenswrapper[17876]: I0313 10:50:20.783649 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 13 10:50:20.939775 master-0 kubenswrapper[17876]: I0313 10:50:20.939713 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 13 10:50:21.008474 master-0 kubenswrapper[17876]: I0313 10:50:21.008385 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 13 10:50:21.081082 master-0 kubenswrapper[17876]: I0313 10:50:21.080932 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-894vf"
Mar 13 10:50:21.081901 master-0 kubenswrapper[17876]: I0313 10:50:21.081853 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 13 10:50:21.139625 master-0 kubenswrapper[17876]: I0313 10:50:21.139544 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 13 10:50:21.157052 master-0 kubenswrapper[17876]: I0313 10:50:21.156975 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 13 10:50:21.283915 master-0 kubenswrapper[17876]: I0313 10:50:21.283803 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 13 10:50:21.293605 master-0 kubenswrapper[17876]: I0313 10:50:21.293499 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 13 10:50:21.316590 master-0 kubenswrapper[17876]: I0313 10:50:21.316515 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 13 10:50:21.347060 master-0 kubenswrapper[17876]: I0313 10:50:21.346872 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 13 10:50:21.386225 master-0 kubenswrapper[17876]: I0313 10:50:21.385863 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 13 10:50:21.418424 master-0 kubenswrapper[17876]: I0313 10:50:21.418331 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 13 10:50:21.487904 master-0 kubenswrapper[17876]: I0313 10:50:21.487845 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-t57pn"
Mar 13 10:50:21.555261 master-0 kubenswrapper[17876]: I0313 10:50:21.555187 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 13 10:50:21.608866 master-0 kubenswrapper[17876]: I0313 10:50:21.608724 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 13 10:50:21.620569 master-0 kubenswrapper[17876]: I0313 10:50:21.620480 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 13 10:50:21.726794 master-0 kubenswrapper[17876]: I0313 10:50:21.726737 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 13 10:50:21.810545 master-0 kubenswrapper[17876]: I0313 10:50:21.810472 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-24kvc"
Mar 13 10:50:21.858140 master-0 kubenswrapper[17876]: I0313 10:50:21.858022 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 13 10:50:21.939831 master-0 kubenswrapper[17876]: I0313 10:50:21.939744 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 13 10:50:21.942852 master-0 kubenswrapper[17876]: I0313 10:50:21.942789 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 13 10:50:22.056851 master-0 kubenswrapper[17876]: I0313 10:50:22.056741 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 13 10:50:22.084951 master-0 kubenswrapper[17876]: I0313 10:50:22.084861 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides"
Mar 13 10:50:22.090857 master-0 kubenswrapper[17876]: I0313 10:50:22.090781 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 13 10:50:22.129948 master-0 kubenswrapper[17876]: I0313 10:50:22.129837 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 13 10:50:22.206334 master-0 kubenswrapper[17876]: I0313 10:50:22.206164 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 13 10:50:22.301315 master-0 kubenswrapper[17876]: I0313 10:50:22.301215 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 13 10:50:22.333833 master-0 kubenswrapper[17876]: I0313 10:50:22.333742 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-tvfvf"
Mar 13 10:50:22.336497 master-0 kubenswrapper[17876]: I0313 10:50:22.336462 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 13 10:50:22.379092 master-0 kubenswrapper[17876]: I0313 10:50:22.378986 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 13 10:50:22.410278 master-0 kubenswrapper[17876]: I0313 10:50:22.410191 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 13 10:50:22.479190 master-0 kubenswrapper[17876]: I0313 10:50:22.478959 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0"
Mar 13 10:50:22.479496 master-0 kubenswrapper[17876]: I0313 10:50:22.479333 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 13 10:50:22.489732 master-0 kubenswrapper[17876]: I0313 10:50:22.489664 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 13 10:50:22.616614 master-0 kubenswrapper[17876]: I0313 10:50:22.616547 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 13 10:50:22.624545 master-0 kubenswrapper[17876]: I0313 10:50:22.624471 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 13 10:50:22.658556 master-0 kubenswrapper[17876]: I0313 10:50:22.658478 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle"
Mar 13 10:50:22.747285 master-0 kubenswrapper[17876]: I0313 10:50:22.747140 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 13 10:50:22.756306 master-0 kubenswrapper[17876]: I0313 10:50:22.756248 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 13 10:50:22.762690 master-0 kubenswrapper[17876]: I0313 10:50:22.762625 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 13 10:50:22.770046 master-0 kubenswrapper[17876]: I0313 10:50:22.769990 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 13 10:50:22.778727 master-0 kubenswrapper[17876]: I0313 10:50:22.778683 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-fi4otoct6tdgf"
Mar 13 10:50:22.792944 master-0 kubenswrapper[17876]: I0313 10:50:22.792879 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 13 10:50:22.819938 master-0 kubenswrapper[17876]: I0313 10:50:22.819862 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle"
Mar 13 10:50:22.853605 master-0 kubenswrapper[17876]: I0313 10:50:22.853513 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-2p4lb"
Mar 13 10:50:22.918779 master-0 kubenswrapper[17876]: I0313 10:50:22.918705 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 13 10:50:23.002873 master-0 kubenswrapper[17876]: I0313 10:50:23.002728 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 13 10:50:23.008192 master-0 kubenswrapper[17876]: I0313 10:50:23.008156 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 13 10:50:23.027002 master-0 kubenswrapper[17876]: I0313 10:50:23.026959 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x4n7x"
Mar 13 10:50:23.034534 master-0 kubenswrapper[17876]: I0313 10:50:23.034495 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle"
Mar 13 10:50:23.080951 master-0 kubenswrapper[17876]: I0313 10:50:23.080885 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 13 10:50:23.081381 master-0 kubenswrapper[17876]: I0313 10:50:23.081301 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" containerID="cri-o://a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec" gracePeriod=5
Mar 13 10:50:23.090509 master-0 kubenswrapper[17876]: I0313 10:50:23.090452 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused" start-of-body=
Mar 13 10:50:23.090690 master-0 kubenswrapper[17876]: I0313 10:50:23.090532 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": dial tcp 192.168.32.10:10257: connect: connection refused"
Mar 13 10:50:23.090690 master-0 kubenswrapper[17876]: I0313 10:50:23.090609 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:50:23.091633 master-0 kubenswrapper[17876]: I0313 10:50:23.091588 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted"
Mar 13 10:50:23.091801 master-0 kubenswrapper[17876]: I0313 10:50:23.091766 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" containerID="cri-o://bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" gracePeriod=30
Mar 13 10:50:23.114969 master-0 kubenswrapper[17876]: I0313 10:50:23.114908 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 13 10:50:23.165067 master-0 kubenswrapper[17876]: I0313 10:50:23.165008 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 13 10:50:23.182275 master-0 kubenswrapper[17876]: I0313 10:50:23.182200 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-5pbvv"
Mar 13 10:50:23.187371 master-0 kubenswrapper[17876]: I0313 10:50:23.187324 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 13 10:50:23.267231 master-0 kubenswrapper[17876]: I0313 10:50:23.266977 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 13 10:50:23.300123 master-0 kubenswrapper[17876]: I0313 10:50:23.300039 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 13 10:50:23.361605 master-0 kubenswrapper[17876]: I0313 10:50:23.361538 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 13 10:50:23.391505 master-0 kubenswrapper[17876]: I0313 10:50:23.391451 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt"
Mar 13 10:50:23.572143 master-0 kubenswrapper[17876]: I0313 10:50:23.571968 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 13 10:50:23.579243 master-0 kubenswrapper[17876]: I0313 10:50:23.579204 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 13 10:50:23.638282 master-0 kubenswrapper[17876]: I0313 10:50:23.638212 17876 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 13 10:50:23.675708 master-0 kubenswrapper[17876]: I0313 10:50:23.675654 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 13 10:50:23.681179 master-0 kubenswrapper[17876]: I0313 10:50:23.681155 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 13 10:50:23.724284 master-0 kubenswrapper[17876]: I0313 10:50:23.723954 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 13 10:50:23.747812 master-0 kubenswrapper[17876]: I0313 10:50:23.742946 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 13 10:50:23.747812 master-0 kubenswrapper[17876]: I0313 10:50:23.746040 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-clxlg"
Mar 13 10:50:23.755282 master-0 kubenswrapper[17876]: I0313 10:50:23.755022 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 13 10:50:23.811785 master-0 kubenswrapper[17876]: I0313 10:50:23.811722 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 13 10:50:23.877131 master-0 kubenswrapper[17876]: I0313 10:50:23.877027 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 13 10:50:23.956430 master-0 kubenswrapper[17876]: I0313 10:50:23.956351 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 13 10:50:23.956870 master-0 kubenswrapper[17876]: I0313 10:50:23.956831 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 13 10:50:23.977000 master-0 kubenswrapper[17876]: I0313 10:50:23.976923 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 13 10:50:23.980741 master-0 kubenswrapper[17876]: I0313 10:50:23.980519 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-xsbqr"
Mar 13 10:50:23.995536 master-0 kubenswrapper[17876]: I0313 10:50:23.995454 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 13 10:50:24.084714 master-0 kubenswrapper[17876]: I0313 10:50:24.084644 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 13 10:50:24.098628 master-0 kubenswrapper[17876]: I0313 10:50:24.098584 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 13 10:50:24.105850 master-0 kubenswrapper[17876]: I0313 10:50:24.105793 17876 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 13 10:50:24.164813 master-0 kubenswrapper[17876]: I0313 10:50:24.164665 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 13 10:50:24.211425 master-0 kubenswrapper[17876]: I0313 10:50:24.211346 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 13 10:50:24.218695 master-0 kubenswrapper[17876]: I0313 10:50:24.218624 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 13 10:50:24.226401 master-0 kubenswrapper[17876]: I0313 10:50:24.226329 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-jckzr"
Mar 13 10:50:24.310117 master-0 kubenswrapper[17876]: I0313 10:50:24.310058 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 13 10:50:24.385607 master-0 kubenswrapper[17876]: I0313 10:50:24.385558 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-5lpgr"
Mar 13 10:50:24.395282 master-0 kubenswrapper[17876]: I0313 10:50:24.395206 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 13 10:50:24.430024 master-0 kubenswrapper[17876]: I0313 10:50:24.429904 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 13 10:50:24.431844 master-0 kubenswrapper[17876]: I0313 10:50:24.431814 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 13 10:50:24.435324 master-0 kubenswrapper[17876]: I0313 10:50:24.435284 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 13 10:50:24.436863 master-0 kubenswrapper[17876]: I0313 10:50:24.436828 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 13 10:50:24.437215 master-0 kubenswrapper[17876]: I0313 10:50:24.437180 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 13 10:50:24.514204 master-0 kubenswrapper[17876]: I0313 10:50:24.514150 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 13 10:50:24.581458 master-0 kubenswrapper[17876]: I0313 10:50:24.581388 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config"
Mar 13 10:50:24.584076 master-0 kubenswrapper[17876]: I0313 10:50:24.584042 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 13 10:50:24.610482 master-0 kubenswrapper[17876]: I0313 10:50:24.610427 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 13 10:50:24.717084 master-0 kubenswrapper[17876]: I0313 10:50:24.716946 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 13 10:50:24.777352 master-0 kubenswrapper[17876]: I0313 10:50:24.777282 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 13 10:50:24.932141 master-0 kubenswrapper[17876]: I0313 10:50:24.932053 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-jpsrw"
Mar 13 10:50:24.946807 master-0 kubenswrapper[17876]: I0313 10:50:24.946753 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-kkkpw"
Mar 13 10:50:25.042814 master-0 kubenswrapper[17876]: I0313 10:50:25.042670 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 13 10:50:25.069948 master-0 kubenswrapper[17876]: I0313 10:50:25.069882 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 13 10:50:25.071474 master-0 kubenswrapper[17876]: I0313 10:50:25.071444 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 13 10:50:25.102865 master-0 kubenswrapper[17876]: I0313 10:50:25.102807 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-fsw7z"
Mar 13 10:50:25.168124 master-0 kubenswrapper[17876]: I0313 10:50:25.168055 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 13 10:50:25.203449 master-0 kubenswrapper[17876]: I0313 10:50:25.203389 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 13 10:50:25.233948 master-0 kubenswrapper[17876]: I0313 10:50:25.233886 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 13 10:50:25.257148 master-0 kubenswrapper[17876]: I0313 10:50:25.257065 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 13 10:50:25.257434 master-0 kubenswrapper[17876]: I0313 10:50:25.257409 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 13 10:50:25.290976 master-0 kubenswrapper[17876]: I0313 10:50:25.290920 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 13 10:50:25.414630 master-0 kubenswrapper[17876]: I0313 10:50:25.414549 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 13 10:50:25.418150 master-0 kubenswrapper[17876]: I0313 10:50:25.418110 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca"
Mar 13 10:50:25.488677 master-0 kubenswrapper[17876]: I0313 10:50:25.488606 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 13 10:50:25.554761 master-0 kubenswrapper[17876]: I0313 10:50:25.554705 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 13 10:50:25.572822 master-0 kubenswrapper[17876]: I0313 10:50:25.572751 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-c2nqj"
Mar 13 10:50:25.698297 master-0 kubenswrapper[17876]: I0313 10:50:25.698164 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 13 10:50:25.781804 master-0 kubenswrapper[17876]: I0313 10:50:25.781749 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 13 10:50:25.830399 master-0 kubenswrapper[17876]: I0313 10:50:25.830292 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 13 10:50:25.832527 master-0 kubenswrapper[17876]: I0313 10:50:25.832490 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 13 10:50:25.834928 master-0 kubenswrapper[17876]: I0313 10:50:25.834865 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 13 10:50:25.861491 master-0 kubenswrapper[17876]: I0313 10:50:25.861370 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 13 10:50:25.864531 master-0 kubenswrapper[17876]: I0313 10:50:25.864485 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 13 10:50:25.935174 master-0 kubenswrapper[17876]: I0313 10:50:25.935113 17876 reflector.go:368] Caches populated for
*v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 13 10:50:26.035181 master-0 kubenswrapper[17876]: I0313 10:50:26.034964 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 13 10:50:26.065129 master-0 kubenswrapper[17876]: I0313 10:50:26.065044 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 13 10:50:26.075509 master-0 kubenswrapper[17876]: I0313 10:50:26.075417 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 13 10:50:26.095383 master-0 kubenswrapper[17876]: I0313 10:50:26.095321 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 13 10:50:26.100675 master-0 kubenswrapper[17876]: I0313 10:50:26.100601 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 13 10:50:26.125762 master-0 kubenswrapper[17876]: I0313 10:50:26.125691 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 13 10:50:26.144561 master-0 kubenswrapper[17876]: I0313 10:50:26.144441 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 13 10:50:26.320501 master-0 kubenswrapper[17876]: I0313 10:50:26.320322 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 13 10:50:26.417568 master-0 kubenswrapper[17876]: I0313 10:50:26.417507 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 13 10:50:26.494532 master-0 kubenswrapper[17876]: I0313 10:50:26.494469 17876 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 13 10:50:26.548302 master-0 kubenswrapper[17876]: I0313 10:50:26.548242 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 13 10:50:26.560405 master-0 kubenswrapper[17876]: I0313 10:50:26.560305 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 13 10:50:26.573665 master-0 kubenswrapper[17876]: I0313 10:50:26.573496 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 13 10:50:26.586214 master-0 kubenswrapper[17876]: I0313 10:50:26.586159 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 13 10:50:26.617333 master-0 kubenswrapper[17876]: I0313 10:50:26.617283 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 13 10:50:26.668051 master-0 kubenswrapper[17876]: I0313 10:50:26.667995 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 13 10:50:26.705890 master-0 kubenswrapper[17876]: I0313 10:50:26.705811 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 13 10:50:26.792039 master-0 kubenswrapper[17876]: I0313 10:50:26.791968 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 13 10:50:26.896763 master-0 kubenswrapper[17876]: I0313 10:50:26.896667 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 13 10:50:26.922601 master-0 kubenswrapper[17876]: I0313 10:50:26.922491 17876 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 13 10:50:27.067235 master-0 kubenswrapper[17876]: I0313 10:50:27.067159 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 13 10:50:27.070494 master-0 kubenswrapper[17876]: I0313 10:50:27.070454 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 13 10:50:27.138554 master-0 kubenswrapper[17876]: I0313 10:50:27.138485 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 13 10:50:27.262579 master-0 kubenswrapper[17876]: I0313 10:50:27.262403 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 13 10:50:27.400534 master-0 kubenswrapper[17876]: I0313 10:50:27.400438 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 13 10:50:27.509899 master-0 kubenswrapper[17876]: I0313 10:50:27.508437 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 13 10:50:27.630355 master-0 kubenswrapper[17876]: I0313 10:50:27.630308 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 13 10:50:27.785064 master-0 kubenswrapper[17876]: I0313 10:50:27.784218 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 13 10:50:27.824983 master-0 kubenswrapper[17876]: I0313 10:50:27.824907 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 13 10:50:27.895860 master-0 kubenswrapper[17876]: I0313 10:50:27.895728 17876 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 13 10:50:28.026345 master-0 kubenswrapper[17876]: I0313 10:50:28.026279 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 13 10:50:28.166660 master-0 kubenswrapper[17876]: I0313 10:50:28.166606 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 13 10:50:28.215822 master-0 kubenswrapper[17876]: I0313 10:50:28.215764 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 10:50:28.250016 master-0 kubenswrapper[17876]: I0313 10:50:28.249965 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 13 10:50:28.250232 master-0 kubenswrapper[17876]: I0313 10:50:28.250054 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:50:28.305585 master-0 kubenswrapper[17876]: I0313 10:50:28.305518 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 13 10:50:28.420813 master-0 kubenswrapper[17876]: I0313 10:50:28.420643 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 10:50:28.420813 master-0 kubenswrapper[17876]: I0313 10:50:28.420757 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 10:50:28.420813 master-0 kubenswrapper[17876]: I0313 10:50:28.420790 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 10:50:28.420813 master-0 kubenswrapper[17876]: I0313 10:50:28.420808 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") pod \"a814bd60de133d95cf99630a978c017e\" (UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.420853 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") pod \"a814bd60de133d95cf99630a978c017e\" 
(UID: \"a814bd60de133d95cf99630a978c017e\") " Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.421023 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.421126 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests" (OuterVolumeSpecName: "manifests") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.421158 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log" (OuterVolumeSpecName: "var-log") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.421178 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock" (OuterVolumeSpecName: "var-lock") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.421189 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:50:28.421200 master-0 kubenswrapper[17876]: I0313 10:50:28.421202 17876 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-manifests\") on node \"master-0\" DevicePath \"\"" Mar 13 10:50:28.428198 master-0 kubenswrapper[17876]: I0313 10:50:28.428115 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "a814bd60de133d95cf99630a978c017e" (UID: "a814bd60de133d95cf99630a978c017e"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:50:28.468927 master-0 kubenswrapper[17876]: I0313 10:50:28.468849 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 13 10:50:28.504289 master-0 kubenswrapper[17876]: I0313 10:50:28.504213 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a814bd60de133d95cf99630a978c017e" path="/var/lib/kubelet/pods/a814bd60de133d95cf99630a978c017e/volumes" Mar 13 10:50:28.523635 master-0 kubenswrapper[17876]: I0313 10:50:28.523306 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:50:28.523635 master-0 kubenswrapper[17876]: I0313 10:50:28.523368 17876 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:50:28.523635 master-0 kubenswrapper[17876]: I0313 10:50:28.523398 17876 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/a814bd60de133d95cf99630a978c017e-var-log\") on node \"master-0\" DevicePath \"\"" Mar 13 10:50:28.887240 master-0 kubenswrapper[17876]: I0313 10:50:28.887169 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_a814bd60de133d95cf99630a978c017e/startup-monitor/0.log" Mar 13 10:50:28.887581 master-0 kubenswrapper[17876]: I0313 10:50:28.887252 17876 generic.go:334] "Generic (PLEG): container finished" podID="a814bd60de133d95cf99630a978c017e" containerID="a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec" exitCode=137 Mar 13 10:50:28.887581 master-0 kubenswrapper[17876]: I0313 10:50:28.887319 17876 scope.go:117] "RemoveContainer" 
containerID="a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec" Mar 13 10:50:28.887581 master-0 kubenswrapper[17876]: I0313 10:50:28.887481 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 13 10:50:28.913392 master-0 kubenswrapper[17876]: I0313 10:50:28.913345 17876 scope.go:117] "RemoveContainer" containerID="a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec" Mar 13 10:50:28.913982 master-0 kubenswrapper[17876]: E0313 10:50:28.913931 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec\": container with ID starting with a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec not found: ID does not exist" containerID="a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec" Mar 13 10:50:28.914047 master-0 kubenswrapper[17876]: I0313 10:50:28.913990 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec"} err="failed to get container status \"a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec\": rpc error: code = NotFound desc = could not find container \"a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec\": container with ID starting with a1e70891b6d7c011014831361cccd44f6bef06bd4013f0c5362d940a239322ec not found: ID does not exist" Mar 13 10:50:29.006397 master-0 kubenswrapper[17876]: I0313 10:50:29.006322 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 13 10:50:29.043460 master-0 kubenswrapper[17876]: I0313 10:50:29.043382 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 13 10:50:29.159266 
master-0 kubenswrapper[17876]: I0313 10:50:29.158904 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 13 10:50:29.215887 master-0 kubenswrapper[17876]: I0313 10:50:29.215807 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 13 10:50:29.273476 master-0 kubenswrapper[17876]: I0313 10:50:29.273426 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 13 10:50:29.337263 master-0 kubenswrapper[17876]: I0313 10:50:29.337206 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 13 10:50:29.342815 master-0 kubenswrapper[17876]: I0313 10:50:29.342795 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 13 10:50:29.392066 master-0 kubenswrapper[17876]: I0313 10:50:29.392002 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 13 10:50:29.817525 master-0 kubenswrapper[17876]: I0313 10:50:29.817466 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 13 10:50:30.235241 master-0 kubenswrapper[17876]: I0313 10:50:30.235076 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 13 10:50:30.328456 master-0 kubenswrapper[17876]: I0313 10:50:30.328388 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-7cjkc" Mar 13 10:50:31.117835 master-0 kubenswrapper[17876]: I0313 10:50:31.116192 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-97d47c4cb-g5jgs"] Mar 13 10:50:31.329849 master-0 
kubenswrapper[17876]: I0313 10:50:31.329773 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 13 10:50:31.615086 master-0 kubenswrapper[17876]: I0313 10:50:31.615017 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-97d47c4cb-g5jgs"] Mar 13 10:50:31.627084 master-0 kubenswrapper[17876]: W0313 10:50:31.627030 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2de66bef_f924_4f8e_82d5_01d587573f51.slice/crio-63b3adc38ce05efd4f6bf72c19e1505c89c81162e69f2b412c1b72b0b0eb090a WatchSource:0}: Error finding container 63b3adc38ce05efd4f6bf72c19e1505c89c81162e69f2b412c1b72b0b0eb090a: Status 404 returned error can't find the container with id 63b3adc38ce05efd4f6bf72c19e1505c89c81162e69f2b412c1b72b0b0eb090a Mar 13 10:50:31.911608 master-0 kubenswrapper[17876]: I0313 10:50:31.911530 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" event={"ID":"2de66bef-f924-4f8e-82d5-01d587573f51","Type":"ContainerStarted","Data":"63b3adc38ce05efd4f6bf72c19e1505c89c81162e69f2b412c1b72b0b0eb090a"} Mar 13 10:50:35.516938 master-0 kubenswrapper[17876]: I0313 10:50:35.516825 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:50:35.518144 master-0 kubenswrapper[17876]: I0313 10:50:35.517399 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:50:35.518144 master-0 kubenswrapper[17876]: I0313 10:50:35.517415 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:50:35.531146 master-0 
kubenswrapper[17876]: I0313 10:50:35.530332 17876 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:50:35.533819 master-0 kubenswrapper[17876]: I0313 10:50:35.533738 17876 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e1c305ec-b38e-436b-a891-afbc33d7d70e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-13T10:50:35Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-13T10:50:35Z\\\",\\\"message\\\":null,\\\"reason\\\":null,\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9f53cc3cddb8fe9d1088b7766a1921dd54985febb851e44b5536925b781b058e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T10:49:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://355bf3d5ec99886176b83dda427870237f037b0cd1a25b7014ba793fd220e769\\\",\\
\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T10:49:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e2c88e4c1fc855558d16a23967e91f39d888b2b5d567204372568c3c9fe0b418\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-13T10:49:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}]}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-master-0\": pods \"openshift-kube-scheduler-master-0\" not found" Mar 13 10:50:35.536861 master-0 kubenswrapper[17876]: I0313 10:50:35.536826 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:50:35.545192 master-0 kubenswrapper[17876]: I0313 10:50:35.545136 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:50:35.553511 master-0 kubenswrapper[17876]: I0313 10:50:35.553433 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 13 10:50:36.296687 master-0 kubenswrapper[17876]: I0313 10:50:36.296606 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:50:36.296687 master-0 kubenswrapper[17876]: I0313 10:50:36.296653 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="e1c305ec-b38e-436b-a891-afbc33d7d70e" Mar 13 10:50:37.308188 master-0 kubenswrapper[17876]: I0313 10:50:37.308012 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" event={"ID":"2de66bef-f924-4f8e-82d5-01d587573f51","Type":"ContainerStarted","Data":"889b840de14ed519ac11a1bfa149c056f43a9205ab203625bbbc54d8bf797874"} Mar 13 10:50:37.308188 master-0 kubenswrapper[17876]: I0313 10:50:37.308158 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:37.312212 master-0 kubenswrapper[17876]: I0313 10:50:37.312146 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" Mar 13 10:50:37.343165 master-0 kubenswrapper[17876]: I0313 10:50:37.343015 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-97d47c4cb-g5jgs" podStartSLOduration=52.429614107 podStartE2EDuration="57.34298606s" podCreationTimestamp="2026-03-13 10:49:40 +0000 UTC" firstStartedPulling="2026-03-13 10:50:31.630959277 +0000 UTC m=+539.466765753" lastFinishedPulling="2026-03-13 10:50:36.54433124 +0000 UTC m=+544.380137706" observedRunningTime="2026-03-13 
10:50:37.342727953 +0000 UTC m=+545.178534449" watchObservedRunningTime="2026-03-13 10:50:37.34298606 +0000 UTC m=+545.178792556" Mar 13 10:50:37.387764 master-0 kubenswrapper[17876]: I0313 10:50:37.387661 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.38763653 podStartE2EDuration="2.38763653s" podCreationTimestamp="2026-03-13 10:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:50:37.363507313 +0000 UTC m=+545.199313809" watchObservedRunningTime="2026-03-13 10:50:37.38763653 +0000 UTC m=+545.223443026" Mar 13 10:50:53.457251 master-0 kubenswrapper[17876]: I0313 10:50:53.454587 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/1.log" Mar 13 10:50:53.457251 master-0 kubenswrapper[17876]: I0313 10:50:53.456038 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/0.log" Mar 13 10:50:53.457251 master-0 kubenswrapper[17876]: I0313 10:50:53.456080 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9fc87edb050c91d1c07246e5eb5386e" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" exitCode=137 Mar 13 10:50:53.457251 master-0 kubenswrapper[17876]: I0313 10:50:53.456138 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerDied","Data":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} Mar 13 10:50:53.457251 master-0 kubenswrapper[17876]: I0313 10:50:53.456181 17876 scope.go:117] 
"RemoveContainer" containerID="d671010960870e0da2e9b058c8ed3d53e5393353d4ea9421bce18bd58bb8d5d1" Mar 13 10:50:54.466693 master-0 kubenswrapper[17876]: I0313 10:50:54.466628 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/1.log" Mar 13 10:50:54.468164 master-0 kubenswrapper[17876]: I0313 10:50:54.468083 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"e9fc87edb050c91d1c07246e5eb5386e","Type":"ContainerStarted","Data":"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a"} Mar 13 10:51:03.090735 master-0 kubenswrapper[17876]: I0313 10:51:03.090650 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:03.090735 master-0 kubenswrapper[17876]: I0313 10:51:03.090713 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:03.095747 master-0 kubenswrapper[17876]: I0313 10:51:03.095718 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:03.669820 master-0 kubenswrapper[17876]: I0313 10:51:03.669754 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:09.749659 master-0 kubenswrapper[17876]: I0313 10:51:09.749570 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 13 10:51:09.751284 master-0 kubenswrapper[17876]: E0313 10:51:09.751258 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a814bd60de133d95cf99630a978c017e" 
containerName="startup-monitor" Mar 13 10:51:09.751422 master-0 kubenswrapper[17876]: I0313 10:51:09.751401 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 13 10:51:09.751724 master-0 kubenswrapper[17876]: I0313 10:51:09.751699 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="a814bd60de133d95cf99630a978c017e" containerName="startup-monitor" Mar 13 10:51:09.752573 master-0 kubenswrapper[17876]: I0313 10:51:09.752548 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.755889 master-0 kubenswrapper[17876]: I0313 10:51:09.755808 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 13 10:51:09.755889 master-0 kubenswrapper[17876]: I0313 10:51:09.755864 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-wbwm2" Mar 13 10:51:09.764433 master-0 kubenswrapper[17876]: I0313 10:51:09.764349 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 13 10:51:09.807374 master-0 kubenswrapper[17876]: I0313 10:51:09.807332 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.807666 master-0 kubenswrapper[17876]: I0313 10:51:09.807648 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-var-lock\") pod \"installer-4-master-0\" 
(UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.807803 master-0 kubenswrapper[17876]: I0313 10:51:09.807772 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.909533 master-0 kubenswrapper[17876]: I0313 10:51:09.909471 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-var-lock\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.909533 master-0 kubenswrapper[17876]: I0313 10:51:09.909536 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.909841 master-0 kubenswrapper[17876]: I0313 10:51:09.909685 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.909841 master-0 kubenswrapper[17876]: I0313 10:51:09.909789 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.910030 master-0 kubenswrapper[17876]: I0313 10:51:09.909980 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-var-lock\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:09.933437 master-0 kubenswrapper[17876]: I0313 10:51:09.933393 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kube-api-access\") pod \"installer-4-master-0\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:10.139084 master-0 kubenswrapper[17876]: I0313 10:51:10.139017 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:10.616491 master-0 kubenswrapper[17876]: I0313 10:51:10.616436 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 13 10:51:10.706932 master-0 kubenswrapper[17876]: W0313 10:51:10.706748 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb7fd7f4f_4bd7_48af_8304_e4f74bc1196c.slice/crio-d1b3e88d41b2a1ef81af86eccb100fb6c7a52f6c57f50e3df38425cf1be7d943 WatchSource:0}: Error finding container d1b3e88d41b2a1ef81af86eccb100fb6c7a52f6c57f50e3df38425cf1be7d943: Status 404 returned error can't find the container with id d1b3e88d41b2a1ef81af86eccb100fb6c7a52f6c57f50e3df38425cf1be7d943 Mar 13 10:51:11.226547 master-0 kubenswrapper[17876]: I0313 10:51:11.226359 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c","Type":"ContainerStarted","Data":"4524db8dcbf35e7e123d8271697122d65b897c834b288577fdacdf4a728d7b3d"} Mar 13 10:51:11.226547 master-0 kubenswrapper[17876]: I0313 10:51:11.226427 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c","Type":"ContainerStarted","Data":"d1b3e88d41b2a1ef81af86eccb100fb6c7a52f6c57f50e3df38425cf1be7d943"} Mar 13 10:51:11.250171 master-0 kubenswrapper[17876]: I0313 10:51:11.250043 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=2.250003693 podStartE2EDuration="2.250003693s" podCreationTimestamp="2026-03-13 10:51:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:51:11.244176317 +0000 UTC 
m=+579.079982803" watchObservedRunningTime="2026-03-13 10:51:11.250003693 +0000 UTC m=+579.085810179" Mar 13 10:51:17.270720 master-0 kubenswrapper[17876]: I0313 10:51:17.270597 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf"] Mar 13 10:51:17.272202 master-0 kubenswrapper[17876]: I0313 10:51:17.272177 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.286798 master-0 kubenswrapper[17876]: I0313 10:51:17.286732 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf"] Mar 13 10:51:17.420165 master-0 kubenswrapper[17876]: I0313 10:51:17.420111 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnwrt\" (UniqueName: \"kubernetes.io/projected/896b3c4a-3958-4269-8689-e132e4110d20-kube-api-access-xnwrt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.420165 master-0 kubenswrapper[17876]: I0313 10:51:17.420170 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.420529 master-0 kubenswrapper[17876]: I0313 10:51:17.420203 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.422856 master-0 kubenswrapper[17876]: I0313 10:51:17.422775 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n"] Mar 13 10:51:17.424171 master-0 kubenswrapper[17876]: I0313 10:51:17.424127 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.435873 master-0 kubenswrapper[17876]: I0313 10:51:17.435821 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n"] Mar 13 10:51:17.521922 master-0 kubenswrapper[17876]: I0313 10:51:17.521778 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.521922 master-0 kubenswrapper[17876]: I0313 10:51:17.521865 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.522188 master-0 
kubenswrapper[17876]: I0313 10:51:17.522136 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwzhw\" (UniqueName: \"kubernetes.io/projected/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-kube-api-access-kwzhw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.522718 master-0 kubenswrapper[17876]: I0313 10:51:17.522657 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnwrt\" (UniqueName: \"kubernetes.io/projected/896b3c4a-3958-4269-8689-e132e4110d20-kube-api-access-xnwrt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.522836 master-0 kubenswrapper[17876]: I0313 10:51:17.522762 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.523472 master-0 kubenswrapper[17876]: I0313 10:51:17.523437 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.523559 master-0 kubenswrapper[17876]: I0313 10:51:17.523524 
17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.523915 master-0 kubenswrapper[17876]: I0313 10:51:17.523882 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.538809 master-0 kubenswrapper[17876]: I0313 10:51:17.538719 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnwrt\" (UniqueName: \"kubernetes.io/projected/896b3c4a-3958-4269-8689-e132e4110d20-kube-api-access-xnwrt\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.591138 master-0 kubenswrapper[17876]: I0313 10:51:17.591045 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" Mar 13 10:51:17.625485 master-0 kubenswrapper[17876]: I0313 10:51:17.625404 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.625485 master-0 kubenswrapper[17876]: I0313 10:51:17.625483 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.625761 master-0 kubenswrapper[17876]: I0313 10:51:17.625657 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwzhw\" (UniqueName: \"kubernetes.io/projected/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-kube-api-access-kwzhw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.626252 master-0 kubenswrapper[17876]: I0313 10:51:17.626206 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " 
pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.626422 master-0 kubenswrapper[17876]: I0313 10:51:17.626290 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.652523 master-0 kubenswrapper[17876]: I0313 10:51:17.652408 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwzhw\" (UniqueName: \"kubernetes.io/projected/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-kube-api-access-kwzhw\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.841083 master-0 kubenswrapper[17876]: I0313 10:51:17.811220 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" Mar 13 10:51:17.869580 master-0 kubenswrapper[17876]: I0313 10:51:17.869504 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-retry-2-master-0"] Mar 13 10:51:17.870898 master-0 kubenswrapper[17876]: I0313 10:51:17.870864 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:17.874063 master-0 kubenswrapper[17876]: I0313 10:51:17.873800 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 13 10:51:17.874455 master-0 kubenswrapper[17876]: I0313 10:51:17.874413 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-h29h2" Mar 13 10:51:17.894993 master-0 kubenswrapper[17876]: I0313 10:51:17.894946 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-retry-2-master-0"] Mar 13 10:51:17.916930 master-0 kubenswrapper[17876]: I0313 10:51:17.915019 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-var-lock\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:17.916930 master-0 kubenswrapper[17876]: I0313 10:51:17.915172 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kubelet-dir\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:17.916930 master-0 kubenswrapper[17876]: I0313 10:51:17.915194 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kube-api-access\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.017178 master-0 kubenswrapper[17876]: I0313 10:51:18.016323 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-var-lock\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.017178 master-0 kubenswrapper[17876]: I0313 10:51:18.016449 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kubelet-dir\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.017178 master-0 kubenswrapper[17876]: I0313 10:51:18.016479 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kube-api-access\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.017178 master-0 kubenswrapper[17876]: I0313 10:51:18.016977 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-var-lock\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.017178 master-0 kubenswrapper[17876]: I0313 10:51:18.017024 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kubelet-dir\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.037087 master-0 kubenswrapper[17876]: I0313 10:51:18.037044 17876 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kube-api-access\") pod \"installer-2-retry-2-master-0\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.196992 master-0 kubenswrapper[17876]: W0313 10:51:18.196944 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod896b3c4a_3958_4269_8689_e132e4110d20.slice/crio-624ef0f317ceeaa70cc7d209e63259c405f7e6977236fb201573995ae5eace9a WatchSource:0}: Error finding container 624ef0f317ceeaa70cc7d209e63259c405f7e6977236fb201573995ae5eace9a: Status 404 returned error can't find the container with id 624ef0f317ceeaa70cc7d209e63259c405f7e6977236fb201573995ae5eace9a Mar 13 10:51:18.197266 master-0 kubenswrapper[17876]: I0313 10:51:18.196979 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf"] Mar 13 10:51:18.208945 master-0 kubenswrapper[17876]: I0313 10:51:18.208494 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:18.286926 master-0 kubenswrapper[17876]: I0313 10:51:18.286837 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n"] Mar 13 10:51:18.288830 master-0 kubenswrapper[17876]: W0313 10:51:18.288703 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13790eda_ee5d_4b78_a0d9_3e944cab6f2f.slice/crio-cd76783872b03cbf5cde71389304660df723119db4b899edba2720695bc7141d WatchSource:0}: Error finding container cd76783872b03cbf5cde71389304660df723119db4b899edba2720695bc7141d: Status 404 returned error can't find the container with id cd76783872b03cbf5cde71389304660df723119db4b899edba2720695bc7141d Mar 13 10:51:18.379360 master-0 kubenswrapper[17876]: I0313 10:51:18.378293 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" event={"ID":"896b3c4a-3958-4269-8689-e132e4110d20","Type":"ContainerStarted","Data":"37a5245305cd3dad9c82a9c2ad41d5972de29337f9abe8699c0187e04b1b6dd9"} Mar 13 10:51:18.379360 master-0 kubenswrapper[17876]: I0313 10:51:18.378390 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" event={"ID":"896b3c4a-3958-4269-8689-e132e4110d20","Type":"ContainerStarted","Data":"624ef0f317ceeaa70cc7d209e63259c405f7e6977236fb201573995ae5eace9a"} Mar 13 10:51:18.384489 master-0 kubenswrapper[17876]: I0313 10:51:18.384450 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" event={"ID":"13790eda-ee5d-4b78-a0d9-3e944cab6f2f","Type":"ContainerStarted","Data":"cd76783872b03cbf5cde71389304660df723119db4b899edba2720695bc7141d"} Mar 13 
10:51:18.465591 master-0 kubenswrapper[17876]: I0313 10:51:18.465504 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2"] Mar 13 10:51:18.473240 master-0 kubenswrapper[17876]: I0313 10:51:18.473180 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.479269 master-0 kubenswrapper[17876]: I0313 10:51:18.479205 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2"] Mar 13 10:51:18.651222 master-0 kubenswrapper[17876]: I0313 10:51:18.651160 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.651222 master-0 kubenswrapper[17876]: I0313 10:51:18.651226 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.651879 master-0 kubenswrapper[17876]: I0313 10:51:18.651648 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt8rq\" (UniqueName: \"kubernetes.io/projected/453b6535-0292-4fc0-aa92-c3ba224c0aa0-kube-api-access-wt8rq\") pod 
\"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.752963 master-0 kubenswrapper[17876]: I0313 10:51:18.752838 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.752963 master-0 kubenswrapper[17876]: I0313 10:51:18.752910 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.753358 master-0 kubenswrapper[17876]: I0313 10:51:18.752995 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wt8rq\" (UniqueName: \"kubernetes.io/projected/453b6535-0292-4fc0-aa92-c3ba224c0aa0-kube-api-access-wt8rq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.753751 master-0 kubenswrapper[17876]: I0313 10:51:18.753706 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: 
\"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.754472 master-0 kubenswrapper[17876]: I0313 10:51:18.754404 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.990125 master-0 kubenswrapper[17876]: I0313 10:51:18.989999 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt8rq\" (UniqueName: \"kubernetes.io/projected/453b6535-0292-4fc0-aa92-c3ba224c0aa0-kube-api-access-wt8rq\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" Mar 13 10:51:18.993332 master-0 kubenswrapper[17876]: I0313 10:51:18.993244 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-retry-2-master-0"] Mar 13 10:51:19.116139 master-0 kubenswrapper[17876]: I0313 10:51:19.115682 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2"
Mar 13 10:51:19.404208 master-0 kubenswrapper[17876]: I0313 10:51:19.404053 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-2-master-0" event={"ID":"de9eb09a-0b9b-4190-b3ce-7eb971c93fae","Type":"ContainerStarted","Data":"f0f1c877867779e63634fae43ed1622dfd8e25252bf72898d372ceee897efc1c"}
Mar 13 10:51:19.409419 master-0 kubenswrapper[17876]: I0313 10:51:19.409358 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" event={"ID":"896b3c4a-3958-4269-8689-e132e4110d20","Type":"ContainerDied","Data":"37a5245305cd3dad9c82a9c2ad41d5972de29337f9abe8699c0187e04b1b6dd9"}
Mar 13 10:51:19.410633 master-0 kubenswrapper[17876]: I0313 10:51:19.409246 17876 generic.go:334] "Generic (PLEG): container finished" podID="896b3c4a-3958-4269-8689-e132e4110d20" containerID="37a5245305cd3dad9c82a9c2ad41d5972de29337f9abe8699c0187e04b1b6dd9" exitCode=0
Mar 13 10:51:19.414295 master-0 kubenswrapper[17876]: I0313 10:51:19.414235 17876 generic.go:334] "Generic (PLEG): container finished" podID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerID="53fa8ba99e504ccf7848a97bfcd12c731ef532ebf1d68fed3360933a02f6509c" exitCode=0
Mar 13 10:51:19.414549 master-0 kubenswrapper[17876]: I0313 10:51:19.414310 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" event={"ID":"13790eda-ee5d-4b78-a0d9-3e944cab6f2f","Type":"ContainerDied","Data":"53fa8ba99e504ccf7848a97bfcd12c731ef532ebf1d68fed3360933a02f6509c"}
Mar 13 10:51:19.548043 master-0 kubenswrapper[17876]: I0313 10:51:19.547996 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2"]
Mar 13 10:51:19.549433 master-0 kubenswrapper[17876]: W0313 10:51:19.549392 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453b6535_0292_4fc0_aa92_c3ba224c0aa0.slice/crio-fd49b9d423ac92e372dacca3c1ec370fe59fd798d31d03ce8a26d8cfe0613da7 WatchSource:0}: Error finding container fd49b9d423ac92e372dacca3c1ec370fe59fd798d31d03ce8a26d8cfe0613da7: Status 404 returned error can't find the container with id fd49b9d423ac92e372dacca3c1ec370fe59fd798d31d03ce8a26d8cfe0613da7
Mar 13 10:51:19.922656 master-0 kubenswrapper[17876]: E0313 10:51:19.922574 17876 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod453b6535_0292_4fc0_aa92_c3ba224c0aa0.slice/crio-conmon-9f2470fe0e0b7cda521aef7d34e92da58b8cb378bdb740c84401f3db3137f334.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 10:51:20.423282 master-0 kubenswrapper[17876]: I0313 10:51:20.423207 17876 generic.go:334] "Generic (PLEG): container finished" podID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerID="9f2470fe0e0b7cda521aef7d34e92da58b8cb378bdb740c84401f3db3137f334" exitCode=0
Mar 13 10:51:20.423875 master-0 kubenswrapper[17876]: I0313 10:51:20.423289 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" event={"ID":"453b6535-0292-4fc0-aa92-c3ba224c0aa0","Type":"ContainerDied","Data":"9f2470fe0e0b7cda521aef7d34e92da58b8cb378bdb740c84401f3db3137f334"}
Mar 13 10:51:20.423875 master-0 kubenswrapper[17876]: I0313 10:51:20.423385 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" event={"ID":"453b6535-0292-4fc0-aa92-c3ba224c0aa0","Type":"ContainerStarted","Data":"fd49b9d423ac92e372dacca3c1ec370fe59fd798d31d03ce8a26d8cfe0613da7"}
Mar 13 10:51:20.426256 master-0 kubenswrapper[17876]: I0313 10:51:20.426125 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-2-master-0" event={"ID":"de9eb09a-0b9b-4190-b3ce-7eb971c93fae","Type":"ContainerStarted","Data":"1c3f13f13a92f1d9f52dca642a5da02bd099b4378330a2826bf3be767a78a8d7"}
Mar 13 10:51:21.435498 master-0 kubenswrapper[17876]: I0313 10:51:21.435336 17876 generic.go:334] "Generic (PLEG): container finished" podID="896b3c4a-3958-4269-8689-e132e4110d20" containerID="21c7d5b413fa586be235dcb7a29258bd6bfe9cf12f2804d58978ddeeb510dc56" exitCode=0
Mar 13 10:51:21.435498 master-0 kubenswrapper[17876]: I0313 10:51:21.435402 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" event={"ID":"896b3c4a-3958-4269-8689-e132e4110d20","Type":"ContainerDied","Data":"21c7d5b413fa586be235dcb7a29258bd6bfe9cf12f2804d58978ddeeb510dc56"}
Mar 13 10:51:21.459151 master-0 kubenswrapper[17876]: I0313 10:51:21.459025 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-retry-2-master-0" podStartSLOduration=4.45900503 podStartE2EDuration="4.45900503s" podCreationTimestamp="2026-03-13 10:51:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:51:20.469168886 +0000 UTC m=+588.304975412" watchObservedRunningTime="2026-03-13 10:51:21.45900503 +0000 UTC m=+589.294811506"
Mar 13 10:51:22.444668 master-0 kubenswrapper[17876]: I0313 10:51:22.444606 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" event={"ID":"453b6535-0292-4fc0-aa92-c3ba224c0aa0","Type":"ContainerStarted","Data":"f2b18cd231926e8f3d5a9c7d9888af2bb8fd835b34b6b6de79c00d140c2bdedf"}
Mar 13 10:51:22.456565 master-0 kubenswrapper[17876]: I0313 10:51:22.456503 17876 generic.go:334] "Generic (PLEG): container finished" podID="896b3c4a-3958-4269-8689-e132e4110d20" containerID="a4bb496df25f880694382d026ff74ce076f3adb118dbf9ab2808e0d6457ec79f" exitCode=0
Mar 13 10:51:22.456565 master-0 kubenswrapper[17876]: I0313 10:51:22.456569 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" event={"ID":"896b3c4a-3958-4269-8689-e132e4110d20","Type":"ContainerDied","Data":"a4bb496df25f880694382d026ff74ce076f3adb118dbf9ab2808e0d6457ec79f"}
Mar 13 10:51:23.889826 master-0 kubenswrapper[17876]: I0313 10:51:23.889783 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf"
Mar 13 10:51:23.959869 master-0 kubenswrapper[17876]: I0313 10:51:23.958892 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnwrt\" (UniqueName: \"kubernetes.io/projected/896b3c4a-3958-4269-8689-e132e4110d20-kube-api-access-xnwrt\") pod \"896b3c4a-3958-4269-8689-e132e4110d20\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") "
Mar 13 10:51:23.959869 master-0 kubenswrapper[17876]: I0313 10:51:23.959042 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-bundle\") pod \"896b3c4a-3958-4269-8689-e132e4110d20\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") "
Mar 13 10:51:23.959869 master-0 kubenswrapper[17876]: I0313 10:51:23.959182 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-util\") pod \"896b3c4a-3958-4269-8689-e132e4110d20\" (UID: \"896b3c4a-3958-4269-8689-e132e4110d20\") "
Mar 13 10:51:23.961961 master-0 kubenswrapper[17876]: I0313 10:51:23.960429 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-bundle" (OuterVolumeSpecName: "bundle") pod "896b3c4a-3958-4269-8689-e132e4110d20" (UID: "896b3c4a-3958-4269-8689-e132e4110d20"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:51:23.962321 master-0 kubenswrapper[17876]: I0313 10:51:23.962274 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/896b3c4a-3958-4269-8689-e132e4110d20-kube-api-access-xnwrt" (OuterVolumeSpecName: "kube-api-access-xnwrt") pod "896b3c4a-3958-4269-8689-e132e4110d20" (UID: "896b3c4a-3958-4269-8689-e132e4110d20"). InnerVolumeSpecName "kube-api-access-xnwrt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:51:23.986942 master-0 kubenswrapper[17876]: I0313 10:51:23.986849 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-util" (OuterVolumeSpecName: "util") pod "896b3c4a-3958-4269-8689-e132e4110d20" (UID: "896b3c4a-3958-4269-8689-e132e4110d20"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:51:24.060655 master-0 kubenswrapper[17876]: I0313 10:51:24.060605 17876 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:24.060655 master-0 kubenswrapper[17876]: I0313 10:51:24.060641 17876 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/896b3c4a-3958-4269-8689-e132e4110d20-util\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:24.060655 master-0 kubenswrapper[17876]: I0313 10:51:24.060654 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnwrt\" (UniqueName: \"kubernetes.io/projected/896b3c4a-3958-4269-8689-e132e4110d20-kube-api-access-xnwrt\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:24.472212 master-0 kubenswrapper[17876]: I0313 10:51:24.472144 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" event={"ID":"453b6535-0292-4fc0-aa92-c3ba224c0aa0","Type":"ContainerDied","Data":"f2b18cd231926e8f3d5a9c7d9888af2bb8fd835b34b6b6de79c00d140c2bdedf"}
Mar 13 10:51:24.473862 master-0 kubenswrapper[17876]: I0313 10:51:24.473436 17876 generic.go:334] "Generic (PLEG): container finished" podID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerID="f2b18cd231926e8f3d5a9c7d9888af2bb8fd835b34b6b6de79c00d140c2bdedf" exitCode=0
Mar 13 10:51:24.477070 master-0 kubenswrapper[17876]: I0313 10:51:24.477021 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf" event={"ID":"896b3c4a-3958-4269-8689-e132e4110d20","Type":"ContainerDied","Data":"624ef0f317ceeaa70cc7d209e63259c405f7e6977236fb201573995ae5eace9a"}
Mar 13 10:51:24.477070 master-0 kubenswrapper[17876]: I0313 10:51:24.477062 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="624ef0f317ceeaa70cc7d209e63259c405f7e6977236fb201573995ae5eace9a"
Mar 13 10:51:24.477246 master-0 kubenswrapper[17876]: I0313 10:51:24.477109 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1lbvvf"
Mar 13 10:51:24.483641 master-0 kubenswrapper[17876]: I0313 10:51:24.483609 17876 generic.go:334] "Generic (PLEG): container finished" podID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerID="5bef70e79fb983f32ee624a66be1d1b7ccb9a30ef5f1edd308c0cf43f1b84215" exitCode=0
Mar 13 10:51:24.483815 master-0 kubenswrapper[17876]: I0313 10:51:24.483752 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" event={"ID":"13790eda-ee5d-4b78-a0d9-3e944cab6f2f","Type":"ContainerDied","Data":"5bef70e79fb983f32ee624a66be1d1b7ccb9a30ef5f1edd308c0cf43f1b84215"}
Mar 13 10:51:25.500759 master-0 kubenswrapper[17876]: I0313 10:51:25.500614 17876 generic.go:334] "Generic (PLEG): container finished" podID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerID="576e8c21be63fce0137553cf29c23dbe681ec1b232dae2628425c4bb41ef4a12" exitCode=0
Mar 13 10:51:25.500759 master-0 kubenswrapper[17876]: I0313 10:51:25.500672 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" event={"ID":"13790eda-ee5d-4b78-a0d9-3e944cab6f2f","Type":"ContainerDied","Data":"576e8c21be63fce0137553cf29c23dbe681ec1b232dae2628425c4bb41ef4a12"}
Mar 13 10:51:25.508074 master-0 kubenswrapper[17876]: I0313 10:51:25.507976 17876 generic.go:334] "Generic (PLEG): container finished" podID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerID="4f9073637f1e22ea16bc57bc046efac8cadee52fb5bf5d45d41f000e064fa5cb" exitCode=0
Mar 13 10:51:25.508074 master-0 kubenswrapper[17876]: I0313 10:51:25.508060 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" event={"ID":"453b6535-0292-4fc0-aa92-c3ba224c0aa0","Type":"ContainerDied","Data":"4f9073637f1e22ea16bc57bc046efac8cadee52fb5bf5d45d41f000e064fa5cb"}
Mar 13 10:51:26.897734 master-0 kubenswrapper[17876]: I0313 10:51:26.897523 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n"
Mar 13 10:51:26.899773 master-0 kubenswrapper[17876]: I0313 10:51:26.899725 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwzhw\" (UniqueName: \"kubernetes.io/projected/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-kube-api-access-kwzhw\") pod \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") "
Mar 13 10:51:26.903215 master-0 kubenswrapper[17876]: I0313 10:51:26.902954 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-kube-api-access-kwzhw" (OuterVolumeSpecName: "kube-api-access-kwzhw") pod "13790eda-ee5d-4b78-a0d9-3e944cab6f2f" (UID: "13790eda-ee5d-4b78-a0d9-3e944cab6f2f"). InnerVolumeSpecName "kube-api-access-kwzhw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:51:26.903491 master-0 kubenswrapper[17876]: I0313 10:51:26.903436 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2"
Mar 13 10:51:27.002225 master-0 kubenswrapper[17876]: I0313 10:51:27.001242 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-util\") pod \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") "
Mar 13 10:51:27.002225 master-0 kubenswrapper[17876]: I0313 10:51:27.001337 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-bundle\") pod \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\" (UID: \"13790eda-ee5d-4b78-a0d9-3e944cab6f2f\") "
Mar 13 10:51:27.002225 master-0 kubenswrapper[17876]: I0313 10:51:27.002160 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-bundle" (OuterVolumeSpecName: "bundle") pod "13790eda-ee5d-4b78-a0d9-3e944cab6f2f" (UID: "13790eda-ee5d-4b78-a0d9-3e944cab6f2f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:51:27.002538 master-0 kubenswrapper[17876]: I0313 10:51:27.002402 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-util\") pod \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") "
Mar 13 10:51:27.002938 master-0 kubenswrapper[17876]: I0313 10:51:27.002915 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwzhw\" (UniqueName: \"kubernetes.io/projected/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-kube-api-access-kwzhw\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:27.002938 master-0 kubenswrapper[17876]: I0313 10:51:27.002933 17876 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:27.012854 master-0 kubenswrapper[17876]: I0313 10:51:27.012804 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-util" (OuterVolumeSpecName: "util") pod "13790eda-ee5d-4b78-a0d9-3e944cab6f2f" (UID: "13790eda-ee5d-4b78-a0d9-3e944cab6f2f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:51:27.014548 master-0 kubenswrapper[17876]: I0313 10:51:27.014487 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-util" (OuterVolumeSpecName: "util") pod "453b6535-0292-4fc0-aa92-c3ba224c0aa0" (UID: "453b6535-0292-4fc0-aa92-c3ba224c0aa0"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:51:27.104767 master-0 kubenswrapper[17876]: I0313 10:51:27.104595 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-bundle\") pod \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") "
Mar 13 10:51:27.104767 master-0 kubenswrapper[17876]: I0313 10:51:27.104677 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wt8rq\" (UniqueName: \"kubernetes.io/projected/453b6535-0292-4fc0-aa92-c3ba224c0aa0-kube-api-access-wt8rq\") pod \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\" (UID: \"453b6535-0292-4fc0-aa92-c3ba224c0aa0\") "
Mar 13 10:51:27.106398 master-0 kubenswrapper[17876]: I0313 10:51:27.105497 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-bundle" (OuterVolumeSpecName: "bundle") pod "453b6535-0292-4fc0-aa92-c3ba224c0aa0" (UID: "453b6535-0292-4fc0-aa92-c3ba224c0aa0"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 13 10:51:27.106398 master-0 kubenswrapper[17876]: I0313 10:51:27.105559 17876 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-util\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:27.106398 master-0 kubenswrapper[17876]: I0313 10:51:27.105576 17876 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/13790eda-ee5d-4b78-a0d9-3e944cab6f2f-util\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:27.109185 master-0 kubenswrapper[17876]: I0313 10:51:27.109149 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453b6535-0292-4fc0-aa92-c3ba224c0aa0-kube-api-access-wt8rq" (OuterVolumeSpecName: "kube-api-access-wt8rq") pod "453b6535-0292-4fc0-aa92-c3ba224c0aa0" (UID: "453b6535-0292-4fc0-aa92-c3ba224c0aa0"). InnerVolumeSpecName "kube-api-access-wt8rq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 13 10:51:27.207007 master-0 kubenswrapper[17876]: I0313 10:51:27.206886 17876 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/453b6535-0292-4fc0-aa92-c3ba224c0aa0-bundle\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:27.207007 master-0 kubenswrapper[17876]: I0313 10:51:27.206943 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wt8rq\" (UniqueName: \"kubernetes.io/projected/453b6535-0292-4fc0-aa92-c3ba224c0aa0-kube-api-access-wt8rq\") on node \"master-0\" DevicePath \"\""
Mar 13 10:51:27.524646 master-0 kubenswrapper[17876]: I0313 10:51:27.524587 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n" event={"ID":"13790eda-ee5d-4b78-a0d9-3e944cab6f2f","Type":"ContainerDied","Data":"cd76783872b03cbf5cde71389304660df723119db4b899edba2720695bc7141d"}
Mar 13 10:51:27.524646 master-0 kubenswrapper[17876]: I0313 10:51:27.524621 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5kxf2n"
Mar 13 10:51:27.524991 master-0 kubenswrapper[17876]: I0313 10:51:27.524629 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cd76783872b03cbf5cde71389304660df723119db4b899edba2720695bc7141d"
Mar 13 10:51:27.531127 master-0 kubenswrapper[17876]: I0313 10:51:27.527409 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2" event={"ID":"453b6535-0292-4fc0-aa92-c3ba224c0aa0","Type":"ContainerDied","Data":"fd49b9d423ac92e372dacca3c1ec370fe59fd798d31d03ce8a26d8cfe0613da7"}
Mar 13 10:51:27.531127 master-0 kubenswrapper[17876]: I0313 10:51:27.527455 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd49b9d423ac92e372dacca3c1ec370fe59fd798d31d03ce8a26d8cfe0613da7"
Mar 13 10:51:27.531127 master-0 kubenswrapper[17876]: I0313 10:51:27.527500 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874vxzd2"
Mar 13 10:51:35.076594 master-0 kubenswrapper[17876]: I0313 10:51:35.076522 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"]
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076785 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="pull"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076807 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="pull"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076831 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076837 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076847 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="util"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076854 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="util"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076862 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076868 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076877 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="pull"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076883 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="pull"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076899 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076905 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076919 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="util"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076925 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="util"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076935 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="util"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076941 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="util"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: E0313 10:51:35.076954 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="pull"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.076960 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="pull"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.077103 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="453b6535-0292-4fc0-aa92-c3ba224c0aa0" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.077119 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="896b3c4a-3958-4269-8689-e132e4110d20" containerName="extract"
Mar 13 10:51:35.077315 master-0 kubenswrapper[17876]: I0313 10:51:35.077144 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="13790eda-ee5d-4b78-a0d9-3e944cab6f2f" containerName="extract"
Mar 13 10:51:35.078304 master-0 kubenswrapper[17876]: I0313 10:51:35.077627 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"
Mar 13 10:51:35.084131 master-0 kubenswrapper[17876]: I0313 10:51:35.082859 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt"
Mar 13 10:51:35.084131 master-0 kubenswrapper[17876]: I0313 10:51:35.083146 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt"
Mar 13 10:51:35.101138 master-0 kubenswrapper[17876]: I0313 10:51:35.100624 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"]
Mar 13 10:51:35.112145 master-0 kubenswrapper[17876]: I0313 10:51:35.109686 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.147143 master-0 kubenswrapper[17876]: I0313 10:51:35.138382 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"]
Mar 13 10:51:35.165088 master-0 kubenswrapper[17876]: I0313 10:51:35.164282 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"]
Mar 13 10:51:35.177932 master-0 kubenswrapper[17876]: I0313 10:51:35.177839 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m68rf\" (UniqueName: \"kubernetes.io/projected/9aa7e0ef-b23f-431f-a89a-3c26675d8d79-kube-api-access-m68rf\") pod \"nmstate-operator-796d4cfff4-lt5fs\" (UID: \"9aa7e0ef-b23f-431f-a89a-3c26675d8d79\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"
Mar 13 10:51:35.278991 master-0 kubenswrapper[17876]: I0313 10:51:35.278874 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.278991 master-0 kubenswrapper[17876]: I0313 10:51:35.279004 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.279348 master-0 kubenswrapper[17876]: I0313 10:51:35.279039 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m68rf\" (UniqueName: \"kubernetes.io/projected/9aa7e0ef-b23f-431f-a89a-3c26675d8d79-kube-api-access-m68rf\") pod \"nmstate-operator-796d4cfff4-lt5fs\" (UID: \"9aa7e0ef-b23f-431f-a89a-3c26675d8d79\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"
Mar 13 10:51:35.279348 master-0 kubenswrapper[17876]: I0313 10:51:35.279167 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s98c\" (UniqueName: \"kubernetes.io/projected/43c2090a-261f-4da4-9687-140a15d4f02e-kube-api-access-9s98c\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.295649 master-0 kubenswrapper[17876]: I0313 10:51:35.295597 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m68rf\" (UniqueName: \"kubernetes.io/projected/9aa7e0ef-b23f-431f-a89a-3c26675d8d79-kube-api-access-m68rf\") pod \"nmstate-operator-796d4cfff4-lt5fs\" (UID: \"9aa7e0ef-b23f-431f-a89a-3c26675d8d79\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"
Mar 13 10:51:35.381241 master-0 kubenswrapper[17876]: I0313 10:51:35.381163 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s98c\" (UniqueName: \"kubernetes.io/projected/43c2090a-261f-4da4-9687-140a15d4f02e-kube-api-access-9s98c\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.381604 master-0 kubenswrapper[17876]: I0313 10:51:35.381232 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.381675 master-0 kubenswrapper[17876]: I0313 10:51:35.381634 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.381886 master-0 kubenswrapper[17876]: I0313 10:51:35.381843 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.382122 master-0 kubenswrapper[17876]: I0313 10:51:35.382077 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.399801 master-0 kubenswrapper[17876]: I0313 10:51:35.399735 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9s98c\" (UniqueName: \"kubernetes.io/projected/43c2090a-261f-4da4-9687-140a15d4f02e-kube-api-access-9s98c\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.436119 master-0 kubenswrapper[17876]: I0313 10:51:35.436023 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"
Mar 13 10:51:35.469827 master-0 kubenswrapper[17876]: I0313 10:51:35.469761 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"
Mar 13 10:51:35.861663 master-0 kubenswrapper[17876]: W0313 10:51:35.861613 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9aa7e0ef_b23f_431f_a89a_3c26675d8d79.slice/crio-3feab9dc1c127e262ef1533398d08f4468bf0c08c4ecb30249e39067ec07bfce WatchSource:0}: Error finding container 3feab9dc1c127e262ef1533398d08f4468bf0c08c4ecb30249e39067ec07bfce: Status 404 returned error can't find the container with id 3feab9dc1c127e262ef1533398d08f4468bf0c08c4ecb30249e39067ec07bfce
Mar 13 10:51:35.865895 master-0 kubenswrapper[17876]: I0313 10:51:35.865836 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs"]
Mar 13 10:51:35.962033 master-0 kubenswrapper[17876]: I0313 10:51:35.961970 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs"]
Mar 13 10:51:35.965199 master-0 kubenswrapper[17876]: W0313 10:51:35.964855 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod43c2090a_261f_4da4_9687_140a15d4f02e.slice/crio-13f36d6d05fade5828b9ea943fd617553a2690a33a1b74688eee39e91fa4931c WatchSource:0}: Error finding container 13f36d6d05fade5828b9ea943fd617553a2690a33a1b74688eee39e91fa4931c: Status 404 returned error can't find the container with id 13f36d6d05fade5828b9ea943fd617553a2690a33a1b74688eee39e91fa4931c
Mar 13 10:51:36.592205 master-0 kubenswrapper[17876]: I0313 10:51:36.592115 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs" event={"ID":"9aa7e0ef-b23f-431f-a89a-3c26675d8d79","Type":"ContainerStarted","Data":"3feab9dc1c127e262ef1533398d08f4468bf0c08c4ecb30249e39067ec07bfce"}
Mar 13 10:51:36.593910 master-0 kubenswrapper[17876]: I0313 10:51:36.593867 17876 generic.go:334] "Generic (PLEG): container finished" podID="43c2090a-261f-4da4-9687-140a15d4f02e" containerID="92a9793be487c7cebeef3dd50892a413aa15e4a83433d6e0213593eb865afc48" exitCode=0
Mar 13 10:51:36.594035 master-0 kubenswrapper[17876]: I0313 10:51:36.593919 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" event={"ID":"43c2090a-261f-4da4-9687-140a15d4f02e","Type":"ContainerDied","Data":"92a9793be487c7cebeef3dd50892a413aa15e4a83433d6e0213593eb865afc48"}
Mar 13 10:51:36.594035 master-0 kubenswrapper[17876]: I0313 10:51:36.593967 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" event={"ID":"43c2090a-261f-4da4-9687-140a15d4f02e","Type":"ContainerStarted","Data":"13f36d6d05fade5828b9ea943fd617553a2690a33a1b74688eee39e91fa4931c"}
Mar 13 10:51:39.625859 master-0 kubenswrapper[17876]: I0313 10:51:39.625777 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs" event={"ID":"9aa7e0ef-b23f-431f-a89a-3c26675d8d79","Type":"ContainerStarted","Data":"9a1a2088daf7ba158a26ea5bcd61d9c5baa7084ee3d81abc06d4f8b23ac25a87"}
Mar 13 10:51:39.628633 master-0 kubenswrapper[17876]: I0313 10:51:39.628580 17876 generic.go:334] "Generic (PLEG): container finished" podID="43c2090a-261f-4da4-9687-140a15d4f02e" containerID="2a9086afcc5a4667e6dc3bcc0fbb00d186d8f118f3b2172de4ae4dc98161e144" exitCode=0
Mar 13 10:51:39.628724 master-0 kubenswrapper[17876]: I0313 10:51:39.628640 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" event={"ID":"43c2090a-261f-4da4-9687-140a15d4f02e","Type":"ContainerDied","Data":"2a9086afcc5a4667e6dc3bcc0fbb00d186d8f118f3b2172de4ae4dc98161e144"}
Mar 13 10:51:39.651032 master-0 kubenswrapper[17876]: I0313 10:51:39.650900 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-lt5fs" podStartSLOduration=1.663872612 podStartE2EDuration="4.650874965s" podCreationTimestamp="2026-03-13 10:51:35 +0000 UTC" firstStartedPulling="2026-03-13 10:51:35.863175884 +0000 UTC m=+603.698982360" lastFinishedPulling="2026-03-13 10:51:38.850178237 +0000 UTC m=+606.685984713" observedRunningTime="2026-03-13 10:51:39.643089984 +0000 UTC m=+607.478913530" watchObservedRunningTime="2026-03-13 10:51:39.650874965 +0000 UTC m=+607.486681451"
Mar 13 10:51:40.796117 master-0 kubenswrapper[17876]: I0313 10:51:40.795678 17876 generic.go:334] "Generic (PLEG): container finished" podID="43c2090a-261f-4da4-9687-140a15d4f02e" containerID="a503c6aaa8b624d6c912572f69a59154cdcbf272cd603e906eaab09a4dfdabf6" exitCode=0
Mar 13 10:51:40.796747 master-0 kubenswrapper[17876]: I0313 10:51:40.796675 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" event={"ID":"43c2090a-261f-4da4-9687-140a15d4f02e","Type":"ContainerDied","Data":"a503c6aaa8b624d6c912572f69a59154cdcbf272cd603e906eaab09a4dfdabf6"}
Mar 13 10:51:40.997986 master-0 kubenswrapper[17876]: I0313 10:51:40.997920 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm"]
Mar 13 10:51:40.998874 master-0 kubenswrapper[17876]: I0313 10:51:40.998849 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm"
Mar 13 10:51:41.000687 master-0 kubenswrapper[17876]: I0313 10:51:41.000662 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 13 10:51:41.000913 master-0 kubenswrapper[17876]: I0313 10:51:41.000711 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 13 10:51:41.010558 master-0 kubenswrapper[17876]: I0313 10:51:41.010474 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm"]
Mar 13 10:51:41.131814 master-0 kubenswrapper[17876]: I0313 10:51:41.131748 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1592fbb7-68ed-42ae-89b7-1188ca064e03-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fmfcm\" (UID: \"1592fbb7-68ed-42ae-89b7-1188ca064e03\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm"
Mar 13 10:51:41.132036 master-0 kubenswrapper[17876]: I0313 10:51:41.131932 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9rzq\" (UniqueName: \"kubernetes.io/projected/1592fbb7-68ed-42ae-89b7-1188ca064e03-kube-api-access-n9rzq\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fmfcm\" (UID: \"1592fbb7-68ed-42ae-89b7-1188ca064e03\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm"
Mar 13 10:51:41.232966 master-0 kubenswrapper[17876]: I0313 10:51:41.232902 17876
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9rzq\" (UniqueName: \"kubernetes.io/projected/1592fbb7-68ed-42ae-89b7-1188ca064e03-kube-api-access-n9rzq\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fmfcm\" (UID: \"1592fbb7-68ed-42ae-89b7-1188ca064e03\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" Mar 13 10:51:41.233212 master-0 kubenswrapper[17876]: I0313 10:51:41.233144 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1592fbb7-68ed-42ae-89b7-1188ca064e03-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fmfcm\" (UID: \"1592fbb7-68ed-42ae-89b7-1188ca064e03\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" Mar 13 10:51:41.233766 master-0 kubenswrapper[17876]: I0313 10:51:41.233727 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/1592fbb7-68ed-42ae-89b7-1188ca064e03-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fmfcm\" (UID: \"1592fbb7-68ed-42ae-89b7-1188ca064e03\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" Mar 13 10:51:41.259323 master-0 kubenswrapper[17876]: I0313 10:51:41.259278 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9rzq\" (UniqueName: \"kubernetes.io/projected/1592fbb7-68ed-42ae-89b7-1188ca064e03-kube-api-access-n9rzq\") pod \"cert-manager-operator-controller-manager-66c8bdd694-fmfcm\" (UID: \"1592fbb7-68ed-42ae-89b7-1188ca064e03\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" Mar 13 10:51:41.321009 master-0 kubenswrapper[17876]: I0313 10:51:41.320915 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" Mar 13 10:51:42.024463 master-0 kubenswrapper[17876]: I0313 10:51:42.013940 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm"] Mar 13 10:51:42.024463 master-0 kubenswrapper[17876]: W0313 10:51:42.019444 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1592fbb7_68ed_42ae_89b7_1188ca064e03.slice/crio-e3fbce570fe81d56bd9202674a4e0fe08474dfabe7f5fe8e10bab9a391496747 WatchSource:0}: Error finding container e3fbce570fe81d56bd9202674a4e0fe08474dfabe7f5fe8e10bab9a391496747: Status 404 returned error can't find the container with id e3fbce570fe81d56bd9202674a4e0fe08474dfabe7f5fe8e10bab9a391496747 Mar 13 10:51:42.315701 master-0 kubenswrapper[17876]: I0313 10:51:42.315666 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" Mar 13 10:51:42.365418 master-0 kubenswrapper[17876]: I0313 10:51:42.365338 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-util\") pod \"43c2090a-261f-4da4-9687-140a15d4f02e\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " Mar 13 10:51:42.365811 master-0 kubenswrapper[17876]: I0313 10:51:42.365744 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-bundle\") pod \"43c2090a-261f-4da4-9687-140a15d4f02e\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " Mar 13 10:51:42.366735 master-0 kubenswrapper[17876]: I0313 10:51:42.366707 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s98c\" (UniqueName: \"kubernetes.io/projected/43c2090a-261f-4da4-9687-140a15d4f02e-kube-api-access-9s98c\") pod \"43c2090a-261f-4da4-9687-140a15d4f02e\" (UID: \"43c2090a-261f-4da4-9687-140a15d4f02e\") " Mar 13 10:51:42.370256 master-0 kubenswrapper[17876]: I0313 10:51:42.369865 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-bundle" (OuterVolumeSpecName: "bundle") pod "43c2090a-261f-4da4-9687-140a15d4f02e" (UID: "43c2090a-261f-4da4-9687-140a15d4f02e"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:51:42.372573 master-0 kubenswrapper[17876]: I0313 10:51:42.372532 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43c2090a-261f-4da4-9687-140a15d4f02e-kube-api-access-9s98c" (OuterVolumeSpecName: "kube-api-access-9s98c") pod "43c2090a-261f-4da4-9687-140a15d4f02e" (UID: "43c2090a-261f-4da4-9687-140a15d4f02e"). InnerVolumeSpecName "kube-api-access-9s98c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:51:42.382800 master-0 kubenswrapper[17876]: I0313 10:51:42.382723 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-util" (OuterVolumeSpecName: "util") pod "43c2090a-261f-4da4-9687-140a15d4f02e" (UID: "43c2090a-261f-4da4-9687-140a15d4f02e"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 13 10:51:42.475225 master-0 kubenswrapper[17876]: I0313 10:51:42.468833 17876 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-util\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:42.475225 master-0 kubenswrapper[17876]: I0313 10:51:42.468875 17876 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/43c2090a-261f-4da4-9687-140a15d4f02e-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:42.475225 master-0 kubenswrapper[17876]: I0313 10:51:42.468915 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s98c\" (UniqueName: \"kubernetes.io/projected/43c2090a-261f-4da4-9687-140a15d4f02e-kube-api-access-9s98c\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:42.981516 master-0 kubenswrapper[17876]: I0313 10:51:42.980605 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" event={"ID":"1592fbb7-68ed-42ae-89b7-1188ca064e03","Type":"ContainerStarted","Data":"e3fbce570fe81d56bd9202674a4e0fe08474dfabe7f5fe8e10bab9a391496747"} Mar 13 10:51:42.983511 master-0 kubenswrapper[17876]: I0313 10:51:42.983478 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" event={"ID":"43c2090a-261f-4da4-9687-140a15d4f02e","Type":"ContainerDied","Data":"13f36d6d05fade5828b9ea943fd617553a2690a33a1b74688eee39e91fa4931c"} Mar 13 10:51:42.983600 master-0 kubenswrapper[17876]: I0313 10:51:42.983517 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13f36d6d05fade5828b9ea943fd617553a2690a33a1b74688eee39e91fa4931c" Mar 13 10:51:42.984382 master-0 kubenswrapper[17876]: I0313 10:51:42.984011 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f08trdqs" Mar 13 10:51:43.639645 master-0 kubenswrapper[17876]: I0313 10:51:43.639578 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb"] Mar 13 10:51:43.640259 master-0 kubenswrapper[17876]: E0313 10:51:43.639919 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="util" Mar 13 10:51:43.640259 master-0 kubenswrapper[17876]: I0313 10:51:43.639941 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="util" Mar 13 10:51:43.640259 master-0 kubenswrapper[17876]: E0313 10:51:43.639986 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="pull" Mar 13 10:51:43.640259 master-0 kubenswrapper[17876]: I0313 10:51:43.639992 17876 
state_mem.go:107] "Deleted CPUSet assignment" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="pull" Mar 13 10:51:43.640259 master-0 kubenswrapper[17876]: E0313 10:51:43.640003 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="extract" Mar 13 10:51:43.640259 master-0 kubenswrapper[17876]: I0313 10:51:43.640009 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="extract" Mar 13 10:51:43.640519 master-0 kubenswrapper[17876]: I0313 10:51:43.640390 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="43c2090a-261f-4da4-9687-140a15d4f02e" containerName="extract" Mar 13 10:51:43.642173 master-0 kubenswrapper[17876]: I0313 10:51:43.641057 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.645372 master-0 kubenswrapper[17876]: I0313 10:51:43.645331 17876 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 13 10:51:43.651069 master-0 kubenswrapper[17876]: I0313 10:51:43.651027 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 13 10:51:43.651933 master-0 kubenswrapper[17876]: I0313 10:51:43.651908 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 13 10:51:43.652110 master-0 kubenswrapper[17876]: I0313 10:51:43.652084 17876 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 13 10:51:43.653914 master-0 kubenswrapper[17876]: I0313 10:51:43.653877 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb"] Mar 13 10:51:43.786182 master-0 
kubenswrapper[17876]: I0313 10:51:43.785633 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb773726-9950-4399-871b-815d20abe38c-webhook-cert\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.786182 master-0 kubenswrapper[17876]: I0313 10:51:43.785728 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb773726-9950-4399-871b-815d20abe38c-apiservice-cert\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.786182 master-0 kubenswrapper[17876]: I0313 10:51:43.785757 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2nwj\" (UniqueName: \"kubernetes.io/projected/eb773726-9950-4399-871b-815d20abe38c-kube-api-access-h2nwj\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.887993 master-0 kubenswrapper[17876]: I0313 10:51:43.887923 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb773726-9950-4399-871b-815d20abe38c-webhook-cert\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.888242 master-0 kubenswrapper[17876]: I0313 10:51:43.888008 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb773726-9950-4399-871b-815d20abe38c-apiservice-cert\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.888242 master-0 kubenswrapper[17876]: I0313 10:51:43.888034 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2nwj\" (UniqueName: \"kubernetes.io/projected/eb773726-9950-4399-871b-815d20abe38c-kube-api-access-h2nwj\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.896270 master-0 kubenswrapper[17876]: I0313 10:51:43.892403 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/eb773726-9950-4399-871b-815d20abe38c-webhook-cert\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:43.896270 master-0 kubenswrapper[17876]: I0313 10:51:43.893264 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/eb773726-9950-4399-871b-815d20abe38c-apiservice-cert\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" (UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:44.172166 master-0 kubenswrapper[17876]: I0313 10:51:44.171654 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2nwj\" (UniqueName: \"kubernetes.io/projected/eb773726-9950-4399-871b-815d20abe38c-kube-api-access-h2nwj\") pod \"metallb-operator-controller-manager-57755f98f6-7pnfb\" 
(UID: \"eb773726-9950-4399-871b-815d20abe38c\") " pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:44.175548 master-0 kubenswrapper[17876]: I0313 10:51:44.175511 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:51:44.175907 master-0 kubenswrapper[17876]: I0313 10:51:44.175834 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="cluster-policy-controller" containerID="cri-o://53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" gracePeriod=30 Mar 13 10:51:44.176049 master-0 kubenswrapper[17876]: I0313 10:51:44.176028 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" containerID="cri-o://30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" gracePeriod=30 Mar 13 10:51:44.176093 master-0 kubenswrapper[17876]: I0313 10:51:44.176085 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" gracePeriod=30 Mar 13 10:51:44.176162 master-0 kubenswrapper[17876]: I0313 10:51:44.176134 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" 
gracePeriod=30 Mar 13 10:51:44.177231 master-0 kubenswrapper[17876]: I0313 10:51:44.177204 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:51:44.177550 master-0 kubenswrapper[17876]: E0313 10:51:44.177535 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177600 master-0 kubenswrapper[17876]: I0313 10:51:44.177551 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177600 master-0 kubenswrapper[17876]: E0313 10:51:44.177560 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="cluster-policy-controller" Mar 13 10:51:44.177600 master-0 kubenswrapper[17876]: I0313 10:51:44.177566 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="cluster-policy-controller" Mar 13 10:51:44.177600 master-0 kubenswrapper[17876]: E0313 10:51:44.177581 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-cert-syncer" Mar 13 10:51:44.177600 master-0 kubenswrapper[17876]: I0313 10:51:44.177587 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-cert-syncer" Mar 13 10:51:44.177600 master-0 kubenswrapper[17876]: E0313 10:51:44.177599 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-recovery-controller" Mar 13 10:51:44.177768 master-0 kubenswrapper[17876]: I0313 10:51:44.177605 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fc87edb050c91d1c07246e5eb5386e" 
containerName="kube-controller-manager-recovery-controller" Mar 13 10:51:44.177768 master-0 kubenswrapper[17876]: E0313 10:51:44.177625 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177768 master-0 kubenswrapper[17876]: I0313 10:51:44.177631 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177853 master-0 kubenswrapper[17876]: I0313 10:51:44.177769 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177853 master-0 kubenswrapper[17876]: I0313 10:51:44.177783 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177853 master-0 kubenswrapper[17876]: I0313 10:51:44.177793 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-cert-syncer" Mar 13 10:51:44.177853 master-0 kubenswrapper[17876]: I0313 10:51:44.177804 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager-recovery-controller" Mar 13 10:51:44.177853 master-0 kubenswrapper[17876]: I0313 10:51:44.177817 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.177853 master-0 kubenswrapper[17876]: I0313 10:51:44.177828 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="cluster-policy-controller" Mar 13 10:51:44.178335 master-0 kubenswrapper[17876]: E0313 10:51:44.178310 17876 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.178335 master-0 kubenswrapper[17876]: I0313 10:51:44.178326 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="e9fc87edb050c91d1c07246e5eb5386e" containerName="kube-controller-manager" Mar 13 10:51:44.290185 master-0 kubenswrapper[17876]: I0313 10:51:44.288814 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:44.295290 master-0 kubenswrapper[17876]: I0313 10:51:44.295234 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d4c95608e26ddbbd2e5890fcd9f507b5-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d4c95608e26ddbbd2e5890fcd9f507b5\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.295468 master-0 kubenswrapper[17876]: I0313 10:51:44.295375 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d4c95608e26ddbbd2e5890fcd9f507b5-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d4c95608e26ddbbd2e5890fcd9f507b5\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.397962 master-0 kubenswrapper[17876]: I0313 10:51:44.397875 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d4c95608e26ddbbd2e5890fcd9f507b5-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d4c95608e26ddbbd2e5890fcd9f507b5\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.398220 master-0 kubenswrapper[17876]: I0313 10:51:44.398044 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" 
(UniqueName: \"kubernetes.io/host-path/d4c95608e26ddbbd2e5890fcd9f507b5-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d4c95608e26ddbbd2e5890fcd9f507b5\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.398220 master-0 kubenswrapper[17876]: I0313 10:51:44.398197 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/d4c95608e26ddbbd2e5890fcd9f507b5-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d4c95608e26ddbbd2e5890fcd9f507b5\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.398289 master-0 kubenswrapper[17876]: I0313 10:51:44.398252 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/d4c95608e26ddbbd2e5890fcd9f507b5-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"d4c95608e26ddbbd2e5890fcd9f507b5\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.424947 master-0 kubenswrapper[17876]: I0313 10:51:44.424782 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/1.log" Mar 13 10:51:44.429781 master-0 kubenswrapper[17876]: I0313 10:51:44.429729 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager-cert-syncer/0.log" Mar 13 10:51:44.430276 master-0 kubenswrapper[17876]: I0313 10:51:44.430229 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:44.438256 master-0 kubenswrapper[17876]: I0313 10:51:44.438152 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="e9fc87edb050c91d1c07246e5eb5386e" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:51:44.603155 master-0 kubenswrapper[17876]: I0313 10:51:44.602940 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-resource-dir\") pod \"e9fc87edb050c91d1c07246e5eb5386e\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " Mar 13 10:51:44.603479 master-0 kubenswrapper[17876]: I0313 10:51:44.603061 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "e9fc87edb050c91d1c07246e5eb5386e" (UID: "e9fc87edb050c91d1c07246e5eb5386e"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:51:44.603690 master-0 kubenswrapper[17876]: I0313 10:51:44.603659 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "e9fc87edb050c91d1c07246e5eb5386e" (UID: "e9fc87edb050c91d1c07246e5eb5386e"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:51:44.606214 master-0 kubenswrapper[17876]: I0313 10:51:44.603794 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-cert-dir\") pod \"e9fc87edb050c91d1c07246e5eb5386e\" (UID: \"e9fc87edb050c91d1c07246e5eb5386e\") " Mar 13 10:51:44.609843 master-0 kubenswrapper[17876]: I0313 10:51:44.609416 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:44.610131 master-0 kubenswrapper[17876]: I0313 10:51:44.610086 17876 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/e9fc87edb050c91d1c07246e5eb5386e-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:44.795381 master-0 kubenswrapper[17876]: I0313 10:51:44.795151 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb"] Mar 13 10:51:44.801225 master-0 kubenswrapper[17876]: W0313 10:51:44.801129 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb773726_9950_4399_871b_815d20abe38c.slice/crio-6e4fade4e6d4a038f0b769545639ccd34f37fac0db3deff2af08c3081b34bc72 WatchSource:0}: Error finding container 6e4fade4e6d4a038f0b769545639ccd34f37fac0db3deff2af08c3081b34bc72: Status 404 returned error can't find the container with id 6e4fade4e6d4a038f0b769545639ccd34f37fac0db3deff2af08c3081b34bc72 Mar 13 10:51:45.011632 master-0 kubenswrapper[17876]: I0313 10:51:45.011568 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" 
event={"ID":"eb773726-9950-4399-871b-815d20abe38c","Type":"ContainerStarted","Data":"6e4fade4e6d4a038f0b769545639ccd34f37fac0db3deff2af08c3081b34bc72"} Mar 13 10:51:45.014601 master-0 kubenswrapper[17876]: I0313 10:51:45.014508 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager/1.log" Mar 13 10:51:45.015495 master-0 kubenswrapper[17876]: I0313 10:51:45.015457 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_e9fc87edb050c91d1c07246e5eb5386e/kube-controller-manager-cert-syncer/0.log" Mar 13 10:51:45.015894 master-0 kubenswrapper[17876]: I0313 10:51:45.015862 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9fc87edb050c91d1c07246e5eb5386e" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" exitCode=0 Mar 13 10:51:45.015955 master-0 kubenswrapper[17876]: I0313 10:51:45.015894 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9fc87edb050c91d1c07246e5eb5386e" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" exitCode=0 Mar 13 10:51:45.015955 master-0 kubenswrapper[17876]: I0313 10:51:45.015905 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9fc87edb050c91d1c07246e5eb5386e" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" exitCode=2 Mar 13 10:51:45.015955 master-0 kubenswrapper[17876]: I0313 10:51:45.015915 17876 generic.go:334] "Generic (PLEG): container finished" podID="e9fc87edb050c91d1c07246e5eb5386e" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" exitCode=0 Mar 13 10:51:45.016052 master-0 kubenswrapper[17876]: I0313 10:51:45.015972 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:45.016052 master-0 kubenswrapper[17876]: I0313 10:51:45.015996 17876 scope.go:117] "RemoveContainer" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" Mar 13 10:51:45.018787 master-0 kubenswrapper[17876]: I0313 10:51:45.018748 17876 generic.go:334] "Generic (PLEG): container finished" podID="b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" containerID="4524db8dcbf35e7e123d8271697122d65b897c834b288577fdacdf4a728d7b3d" exitCode=0 Mar 13 10:51:45.018861 master-0 kubenswrapper[17876]: I0313 10:51:45.018789 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c","Type":"ContainerDied","Data":"4524db8dcbf35e7e123d8271697122d65b897c834b288577fdacdf4a728d7b3d"} Mar 13 10:51:45.020320 master-0 kubenswrapper[17876]: I0313 10:51:45.020284 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="e9fc87edb050c91d1c07246e5eb5386e" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:51:45.044588 master-0 kubenswrapper[17876]: I0313 10:51:45.044406 17876 scope.go:117] "RemoveContainer" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" Mar 13 10:51:45.057850 master-0 kubenswrapper[17876]: I0313 10:51:45.057779 17876 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="e9fc87edb050c91d1c07246e5eb5386e" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:51:45.070044 master-0 kubenswrapper[17876]: I0313 10:51:45.069916 17876 scope.go:117] "RemoveContainer" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" Mar 13 10:51:45.129433 master-0 kubenswrapper[17876]: 
I0313 10:51:45.125183 17876 scope.go:117] "RemoveContainer" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" Mar 13 10:51:45.556845 master-0 kubenswrapper[17876]: I0313 10:51:45.556159 17876 scope.go:117] "RemoveContainer" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" Mar 13 10:51:45.606351 master-0 kubenswrapper[17876]: I0313 10:51:45.602482 17876 scope.go:117] "RemoveContainer" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" Mar 13 10:51:45.606351 master-0 kubenswrapper[17876]: E0313 10:51:45.605053 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": container with ID starting with 30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a not found: ID does not exist" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" Mar 13 10:51:45.606351 master-0 kubenswrapper[17876]: I0313 10:51:45.605116 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a"} err="failed to get container status \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": rpc error: code = NotFound desc = could not find container \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": container with ID starting with 30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a not found: ID does not exist" Mar 13 10:51:45.606351 master-0 kubenswrapper[17876]: I0313 10:51:45.605143 17876 scope.go:117] "RemoveContainer" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" Mar 13 10:51:45.610125 master-0 kubenswrapper[17876]: E0313 10:51:45.607454 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find 
container \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": container with ID starting with bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c not found: ID does not exist" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" Mar 13 10:51:45.610125 master-0 kubenswrapper[17876]: I0313 10:51:45.607486 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} err="failed to get container status \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": rpc error: code = NotFound desc = could not find container \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": container with ID starting with bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c not found: ID does not exist" Mar 13 10:51:45.610125 master-0 kubenswrapper[17876]: I0313 10:51:45.607505 17876 scope.go:117] "RemoveContainer" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" Mar 13 10:51:45.610125 master-0 kubenswrapper[17876]: E0313 10:51:45.609770 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": container with ID starting with 35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190 not found: ID does not exist" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" Mar 13 10:51:45.610125 master-0 kubenswrapper[17876]: I0313 10:51:45.609792 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190"} err="failed to get container status \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": rpc error: code = NotFound desc = could not find container 
\"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": container with ID starting with 35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190 not found: ID does not exist" Mar 13 10:51:45.610125 master-0 kubenswrapper[17876]: I0313 10:51:45.609808 17876 scope.go:117] "RemoveContainer" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" Mar 13 10:51:45.610479 master-0 kubenswrapper[17876]: E0313 10:51:45.610230 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": container with ID starting with b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421 not found: ID does not exist" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" Mar 13 10:51:45.610479 master-0 kubenswrapper[17876]: I0313 10:51:45.610247 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421"} err="failed to get container status \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": rpc error: code = NotFound desc = could not find container \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": container with ID starting with b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421 not found: ID does not exist" Mar 13 10:51:45.610479 master-0 kubenswrapper[17876]: I0313 10:51:45.610261 17876 scope.go:117] "RemoveContainer" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" Mar 13 10:51:45.610580 master-0 kubenswrapper[17876]: E0313 10:51:45.610555 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": container with ID starting with 
53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28 not found: ID does not exist" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" Mar 13 10:51:45.610613 master-0 kubenswrapper[17876]: I0313 10:51:45.610571 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28"} err="failed to get container status \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": rpc error: code = NotFound desc = could not find container \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": container with ID starting with 53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28 not found: ID does not exist" Mar 13 10:51:45.610613 master-0 kubenswrapper[17876]: I0313 10:51:45.610598 17876 scope.go:117] "RemoveContainer" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611034 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a"} err="failed to get container status \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": rpc error: code = NotFound desc = could not find container \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": container with ID starting with 30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611057 17876 scope.go:117] "RemoveContainer" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611480 17876 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} err="failed to get container status \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": rpc error: code = NotFound desc = could not find container \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": container with ID starting with bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611499 17876 scope.go:117] "RemoveContainer" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611769 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190"} err="failed to get container status \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": rpc error: code = NotFound desc = could not find container \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": container with ID starting with 35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190 not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611781 17876 scope.go:117] "RemoveContainer" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611976 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421"} err="failed to get container status \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": rpc error: code = NotFound desc = could not find container \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": container with ID starting with 
b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421 not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.611989 17876 scope.go:117] "RemoveContainer" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612203 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28"} err="failed to get container status \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": rpc error: code = NotFound desc = could not find container \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": container with ID starting with 53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28 not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612219 17876 scope.go:117] "RemoveContainer" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612423 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a"} err="failed to get container status \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": rpc error: code = NotFound desc = could not find container \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": container with ID starting with 30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612452 17876 scope.go:117] "RemoveContainer" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612678 17876 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} err="failed to get container status \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": rpc error: code = NotFound desc = could not find container \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": container with ID starting with bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612691 17876 scope.go:117] "RemoveContainer" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612872 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190"} err="failed to get container status \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": rpc error: code = NotFound desc = could not find container \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": container with ID starting with 35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190 not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.612887 17876 scope.go:117] "RemoveContainer" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.613069 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421"} err="failed to get container status \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": rpc error: code = NotFound desc = could not find container \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": container with ID starting 
with b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421 not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.613086 17876 scope.go:117] "RemoveContainer" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.613341 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28"} err="failed to get container status \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": rpc error: code = NotFound desc = could not find container \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": container with ID starting with 53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28 not found: ID does not exist" Mar 13 10:51:45.614123 master-0 kubenswrapper[17876]: I0313 10:51:45.613357 17876 scope.go:117] "RemoveContainer" containerID="30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a" Mar 13 10:51:45.618133 master-0 kubenswrapper[17876]: I0313 10:51:45.617417 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a"} err="failed to get container status \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": rpc error: code = NotFound desc = could not find container \"30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a\": container with ID starting with 30ae95676a659bd34a0e420440f6f8d3233c43678004c39f7ddb1be6ae0bae4a not found: ID does not exist" Mar 13 10:51:45.618133 master-0 kubenswrapper[17876]: I0313 10:51:45.617464 17876 scope.go:117] "RemoveContainer" containerID="bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c" Mar 13 10:51:45.618328 master-0 kubenswrapper[17876]: I0313 10:51:45.618302 17876 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c"} err="failed to get container status \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": rpc error: code = NotFound desc = could not find container \"bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c\": container with ID starting with bd1cb165ecbb0da44e900cc3653f5bab3c4a3b78eaca3bc84cfaae6e83fd992c not found: ID does not exist" Mar 13 10:51:45.618379 master-0 kubenswrapper[17876]: I0313 10:51:45.618364 17876 scope.go:117] "RemoveContainer" containerID="35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190" Mar 13 10:51:45.622124 master-0 kubenswrapper[17876]: I0313 10:51:45.618838 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190"} err="failed to get container status \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": rpc error: code = NotFound desc = could not find container \"35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190\": container with ID starting with 35fb85309349a4245283e8c07131a7aa31bd1e59434822c896459376a444f190 not found: ID does not exist" Mar 13 10:51:45.622124 master-0 kubenswrapper[17876]: I0313 10:51:45.618860 17876 scope.go:117] "RemoveContainer" containerID="b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421" Mar 13 10:51:45.622124 master-0 kubenswrapper[17876]: I0313 10:51:45.620241 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421"} err="failed to get container status \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": rpc error: code = NotFound desc = could not find container \"b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421\": 
container with ID starting with b8ca0e69d701f942a1c52d323beac095a70061c34dc20f69d3b4af2604eb1421 not found: ID does not exist" Mar 13 10:51:45.622124 master-0 kubenswrapper[17876]: I0313 10:51:45.620300 17876 scope.go:117] "RemoveContainer" containerID="53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28" Mar 13 10:51:45.622124 master-0 kubenswrapper[17876]: I0313 10:51:45.620603 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28"} err="failed to get container status \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": rpc error: code = NotFound desc = could not find container \"53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28\": container with ID starting with 53fd07821da0a8652135724f78f7dfd283c95cb115929734bb8cd7668eae4f28 not found: ID does not exist" Mar 13 10:51:46.509507 master-0 kubenswrapper[17876]: I0313 10:51:46.509254 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e9fc87edb050c91d1c07246e5eb5386e" path="/var/lib/kubelet/pods/e9fc87edb050c91d1c07246e5eb5386e/volumes" Mar 13 10:51:47.248135 master-0 kubenswrapper[17876]: I0313 10:51:47.247790 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:47.357136 master-0 kubenswrapper[17876]: I0313 10:51:47.352881 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-var-lock\") pod \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " Mar 13 10:51:47.357136 master-0 kubenswrapper[17876]: I0313 10:51:47.352955 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kube-api-access\") pod \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " Mar 13 10:51:47.357136 master-0 kubenswrapper[17876]: I0313 10:51:47.353027 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kubelet-dir\") pod \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\" (UID: \"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c\") " Mar 13 10:51:47.357136 master-0 kubenswrapper[17876]: I0313 10:51:47.353377 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" (UID: "b7fd7f4f-4bd7-48af-8304-e4f74bc1196c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:51:47.357136 master-0 kubenswrapper[17876]: I0313 10:51:47.353649 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-var-lock" (OuterVolumeSpecName: "var-lock") pod "b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" (UID: "b7fd7f4f-4bd7-48af-8304-e4f74bc1196c"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:51:47.357136 master-0 kubenswrapper[17876]: I0313 10:51:47.356496 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" (UID: "b7fd7f4f-4bd7-48af-8304-e4f74bc1196c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:51:47.457075 master-0 kubenswrapper[17876]: I0313 10:51:47.457024 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:47.457075 master-0 kubenswrapper[17876]: I0313 10:51:47.457071 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:47.457075 master-0 kubenswrapper[17876]: I0313 10:51:47.457085 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b7fd7f4f-4bd7-48af-8304-e4f74bc1196c-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:48.084527 master-0 kubenswrapper[17876]: I0313 10:51:48.084223 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 13 10:51:48.084527 master-0 kubenswrapper[17876]: I0313 10:51:48.084315 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"b7fd7f4f-4bd7-48af-8304-e4f74bc1196c","Type":"ContainerDied","Data":"d1b3e88d41b2a1ef81af86eccb100fb6c7a52f6c57f50e3df38425cf1be7d943"} Mar 13 10:51:48.084527 master-0 kubenswrapper[17876]: I0313 10:51:48.084359 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d1b3e88d41b2a1ef81af86eccb100fb6c7a52f6c57f50e3df38425cf1be7d943" Mar 13 10:51:48.087485 master-0 kubenswrapper[17876]: I0313 10:51:48.086972 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" event={"ID":"1592fbb7-68ed-42ae-89b7-1188ca064e03","Type":"ContainerStarted","Data":"63dcd2c5da45f008bbd1e50bedbee2c19064bf21f889473d57aa383ecadbf606"} Mar 13 10:51:48.115509 master-0 kubenswrapper[17876]: I0313 10:51:48.114452 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-fmfcm" podStartSLOduration=2.802369249 podStartE2EDuration="8.114434675s" podCreationTimestamp="2026-03-13 10:51:40 +0000 UTC" firstStartedPulling="2026-03-13 10:51:42.040433324 +0000 UTC m=+609.876239810" lastFinishedPulling="2026-03-13 10:51:47.35249876 +0000 UTC m=+615.188305236" observedRunningTime="2026-03-13 10:51:48.113658562 +0000 UTC m=+615.949465028" watchObservedRunningTime="2026-03-13 10:51:48.114434675 +0000 UTC m=+615.950241151" Mar 13 10:51:50.610303 master-0 kubenswrapper[17876]: I0313 10:51:50.610232 17876 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:51:50.610845 master-0 kubenswrapper[17876]: I0313 10:51:50.610550 17876 kuberuntime_container.go:808] "Killing 
container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" containerID="cri-o://2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099" gracePeriod=30 Mar 13 10:51:50.610845 master-0 kubenswrapper[17876]: I0313 10:51:50.610713 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" containerID="cri-o://80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47" gracePeriod=30 Mar 13 10:51:50.610845 master-0 kubenswrapper[17876]: I0313 10:51:50.610755 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" containerID="cri-o://ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62" gracePeriod=30 Mar 13 10:51:50.610845 master-0 kubenswrapper[17876]: I0313 10:51:50.610788 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" containerID="cri-o://098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5" gracePeriod=30 Mar 13 10:51:50.610845 master-0 kubenswrapper[17876]: I0313 10:51:50.610822 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" containerID="cri-o://0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1" gracePeriod=30 Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616475 17876 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: E0313 10:51:50.616863 17876 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616880 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: E0313 10:51:50.616892 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616901 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: E0313 10:51:50.616910 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616916 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: E0313 10:51:50.616925 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616930 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: E0313 10:51:50.616936 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616942 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: E0313 10:51:50.616966 17876 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" containerName="installer" Mar 13 10:51:50.616950 master-0 kubenswrapper[17876]: I0313 10:51:50.616973 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" containerName="installer" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: E0313 10:51:50.616996 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617004 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: E0313 10:51:50.617016 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617022 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: E0313 10:51:50.617032 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617038 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617201 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617219 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7fd7f4f-4bd7-48af-8304-e4f74bc1196c" containerName="installer" Mar 13 10:51:50.617658 
master-0 kubenswrapper[17876]: I0313 10:51:50.617263 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617287 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617332 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617342 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617350 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617357 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 13 10:51:50.617658 master-0 kubenswrapper[17876]: I0313 10:51:50.617364 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 13 10:51:50.653644 master-0 kubenswrapper[17876]: I0313 10:51:50.653586 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.653644 master-0 kubenswrapper[17876]: I0313 10:51:50.653641 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.653786 master-0 kubenswrapper[17876]: I0313 10:51:50.653663 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.653786 master-0 kubenswrapper[17876]: I0313 10:51:50.653684 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.653786 master-0 kubenswrapper[17876]: I0313 10:51:50.653729 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.653786 master-0 kubenswrapper[17876]: I0313 10:51:50.653769 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.671492 master-0 kubenswrapper[17876]: E0313 10:51:50.671348 17876 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = the connection is draining" 
event="&Event{ObjectMeta:{etcd-master-0.189c611860f1e540 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:51:50.610707776 +0000 UTC m=+618.446514252,LastTimestamp:2026-03-13 10:51:50.610707776 +0000 UTC m=+618.446514252,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:51:50.754825 master-0 kubenswrapper[17876]: I0313 10:51:50.754782 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.754940 master-0 kubenswrapper[17876]: I0313 10:51:50.754855 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.754981 master-0 kubenswrapper[17876]: I0313 10:51:50.754949 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755036 master-0 kubenswrapper[17876]: I0313 10:51:50.755013 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755109 master-0 kubenswrapper[17876]: I0313 10:51:50.755058 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755109 master-0 kubenswrapper[17876]: I0313 10:51:50.755076 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755177 master-0 kubenswrapper[17876]: I0313 10:51:50.755152 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755177 master-0 kubenswrapper[17876]: I0313 10:51:50.755155 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755235 master-0 kubenswrapper[17876]: I0313 10:51:50.755178 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 
10:51:50.755235 master-0 kubenswrapper[17876]: I0313 10:51:50.755181 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755235 master-0 kubenswrapper[17876]: I0313 10:51:50.755155 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:50.755384 master-0 kubenswrapper[17876]: I0313 10:51:50.755086 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 13 10:51:51.115477 master-0 kubenswrapper[17876]: I0313 10:51:51.115275 17876 generic.go:334] "Generic (PLEG): container finished" podID="de9eb09a-0b9b-4190-b3ce-7eb971c93fae" containerID="1c3f13f13a92f1d9f52dca642a5da02bd099b4378330a2826bf3be767a78a8d7" exitCode=0 Mar 13 10:51:51.115477 master-0 kubenswrapper[17876]: I0313 10:51:51.115378 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-2-master-0" event={"ID":"de9eb09a-0b9b-4190-b3ce-7eb971c93fae","Type":"ContainerDied","Data":"1c3f13f13a92f1d9f52dca642a5da02bd099b4378330a2826bf3be767a78a8d7"} Mar 13 10:51:51.117535 master-0 kubenswrapper[17876]: I0313 10:51:51.117459 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" 
event={"ID":"eb773726-9950-4399-871b-815d20abe38c","Type":"ContainerStarted","Data":"9890366f54ce2d314cd39cfe9e7abc9c86e070469ca4a969703722e9957d0d82"} Mar 13 10:51:51.117868 master-0 kubenswrapper[17876]: I0313 10:51:51.117664 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:51:51.119902 master-0 kubenswrapper[17876]: I0313 10:51:51.119821 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 13 10:51:51.120928 master-0 kubenswrapper[17876]: I0313 10:51:51.120844 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 13 10:51:51.123537 master-0 kubenswrapper[17876]: I0313 10:51:51.123474 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47" exitCode=2 Mar 13 10:51:51.123646 master-0 kubenswrapper[17876]: I0313 10:51:51.123547 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62" exitCode=0 Mar 13 10:51:51.123646 master-0 kubenswrapper[17876]: I0313 10:51:51.123566 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5" exitCode=2 Mar 13 10:51:52.605562 master-0 kubenswrapper[17876]: I0313 10:51:52.605501 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.799998 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-var-lock\") pod \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.800272 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kube-api-access\") pod \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.800395 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kubelet-dir\") pod \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\" (UID: \"de9eb09a-0b9b-4190-b3ce-7eb971c93fae\") " Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.800506 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-var-lock" (OuterVolumeSpecName: "var-lock") pod "de9eb09a-0b9b-4190-b3ce-7eb971c93fae" (UID: "de9eb09a-0b9b-4190-b3ce-7eb971c93fae"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.800720 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "de9eb09a-0b9b-4190-b3ce-7eb971c93fae" (UID: "de9eb09a-0b9b-4190-b3ce-7eb971c93fae"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.802019 17876 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:52.802450 master-0 kubenswrapper[17876]: I0313 10:51:52.802067 17876 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:52.806916 master-0 kubenswrapper[17876]: I0313 10:51:52.806820 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "de9eb09a-0b9b-4190-b3ce-7eb971c93fae" (UID: "de9eb09a-0b9b-4190-b3ce-7eb971c93fae"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:51:52.904245 master-0 kubenswrapper[17876]: I0313 10:51:52.904149 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/de9eb09a-0b9b-4190-b3ce-7eb971c93fae-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 13 10:51:53.145972 master-0 kubenswrapper[17876]: I0313 10:51:53.145883 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-retry-2-master-0" event={"ID":"de9eb09a-0b9b-4190-b3ce-7eb971c93fae","Type":"ContainerDied","Data":"f0f1c877867779e63634fae43ed1622dfd8e25252bf72898d372ceee897efc1c"} Mar 13 10:51:53.145972 master-0 kubenswrapper[17876]: I0313 10:51:53.145955 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-retry-2-master-0" Mar 13 10:51:53.145972 master-0 kubenswrapper[17876]: I0313 10:51:53.145972 17876 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f0f1c877867779e63634fae43ed1622dfd8e25252bf72898d372ceee897efc1c" Mar 13 10:51:57.495926 master-0 kubenswrapper[17876]: I0313 10:51:57.495758 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:51:57.510764 master-0 kubenswrapper[17876]: I0313 10:51:57.510715 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:51:57.510764 master-0 kubenswrapper[17876]: I0313 10:51:57.510763 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:52:04.719059 master-0 kubenswrapper[17876]: E0313 10:52:04.718971 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:14.719756 master-0 kubenswrapper[17876]: E0313 10:52:14.719562 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:20.683580 master-0 kubenswrapper[17876]: E0313 10:52:20.683533 17876 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099.scope\": RecentStats: unable to find data in memory cache]" Mar 13 10:52:21.233768 master-0 kubenswrapper[17876]: I0313 10:52:21.233700 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 13 10:52:21.235557 master-0 kubenswrapper[17876]: I0313 10:52:21.235487 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 13 10:52:21.236586 master-0 kubenswrapper[17876]: I0313 10:52:21.236536 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 13 10:52:21.237315 master-0 kubenswrapper[17876]: I0313 10:52:21.237269 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 13 10:52:21.239059 master-0 kubenswrapper[17876]: I0313 10:52:21.239003 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 10:52:21.406268 master-0 kubenswrapper[17876]: I0313 10:52:21.406146 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406293 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406337 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406367 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir" (OuterVolumeSpecName: "log-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "log-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406457 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406475 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "usr-local-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406625 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406629 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "static-pod-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406672 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406719 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406490 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:52:21.407017 master-0 kubenswrapper[17876]: I0313 10:52:21.406853 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir" (OuterVolumeSpecName: "data-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "data-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:52:21.409026 master-0 kubenswrapper[17876]: I0313 10:52:21.408126 17876 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:52:21.409026 master-0 kubenswrapper[17876]: I0313 10:52:21.408172 17876 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:52:21.409026 master-0 kubenswrapper[17876]: I0313 10:52:21.408329 17876 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 13 10:52:21.409026 master-0 kubenswrapper[17876]: I0313 10:52:21.408353 17876 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:52:21.409026 master-0 kubenswrapper[17876]: I0313 10:52:21.408378 17876 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:52:21.409026 master-0 kubenswrapper[17876]: I0313 10:52:21.408395 17876 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:52:21.513128 master-0 kubenswrapper[17876]: I0313 10:52:21.512973 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 13 10:52:21.514114 master-0 kubenswrapper[17876]: I0313 
10:52:21.514075 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 13 10:52:21.514946 master-0 kubenswrapper[17876]: I0313 10:52:21.514930 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 13 10:52:21.515675 master-0 kubenswrapper[17876]: I0313 10:52:21.515639 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 13 10:52:21.517757 master-0 kubenswrapper[17876]: I0313 10:52:21.517663 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1" exitCode=137 Mar 13 10:52:21.517860 master-0 kubenswrapper[17876]: I0313 10:52:21.517759 17876 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099" exitCode=137 Mar 13 10:52:21.517860 master-0 kubenswrapper[17876]: I0313 10:52:21.517798 17876 scope.go:117] "RemoveContainer" containerID="80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47" Mar 13 10:52:21.517958 master-0 kubenswrapper[17876]: I0313 10:52:21.517871 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 10:52:21.546920 master-0 kubenswrapper[17876]: I0313 10:52:21.546874 17876 scope.go:117] "RemoveContainer" containerID="ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62" Mar 13 10:52:21.581034 master-0 kubenswrapper[17876]: I0313 10:52:21.580968 17876 scope.go:117] "RemoveContainer" containerID="098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5" Mar 13 10:52:21.602960 master-0 kubenswrapper[17876]: I0313 10:52:21.602891 17876 scope.go:117] "RemoveContainer" containerID="0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1" Mar 13 10:52:21.624500 master-0 kubenswrapper[17876]: I0313 10:52:21.624166 17876 scope.go:117] "RemoveContainer" containerID="2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099" Mar 13 10:52:21.642160 master-0 kubenswrapper[17876]: I0313 10:52:21.642039 17876 scope.go:117] "RemoveContainer" containerID="fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9" Mar 13 10:52:21.659222 master-0 kubenswrapper[17876]: I0313 10:52:21.659147 17876 scope.go:117] "RemoveContainer" containerID="7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f" Mar 13 10:52:21.681280 master-0 kubenswrapper[17876]: I0313 10:52:21.681213 17876 scope.go:117] "RemoveContainer" containerID="a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f" Mar 13 10:52:21.704655 master-0 kubenswrapper[17876]: I0313 10:52:21.704604 17876 scope.go:117] "RemoveContainer" containerID="80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47" Mar 13 10:52:21.705445 master-0 kubenswrapper[17876]: E0313 10:52:21.705394 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47\": container with ID starting with 80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47 not found: ID does not 
exist" containerID="80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47" Mar 13 10:52:21.705519 master-0 kubenswrapper[17876]: I0313 10:52:21.705446 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47"} err="failed to get container status \"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47\": rpc error: code = NotFound desc = could not find container \"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47\": container with ID starting with 80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47 not found: ID does not exist" Mar 13 10:52:21.705519 master-0 kubenswrapper[17876]: I0313 10:52:21.705491 17876 scope.go:117] "RemoveContainer" containerID="ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62" Mar 13 10:52:21.706131 master-0 kubenswrapper[17876]: E0313 10:52:21.706069 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62\": container with ID starting with ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62 not found: ID does not exist" containerID="ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62" Mar 13 10:52:21.706210 master-0 kubenswrapper[17876]: I0313 10:52:21.706134 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62"} err="failed to get container status \"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62\": rpc error: code = NotFound desc = could not find container \"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62\": container with ID starting with ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62 not found: ID does not exist" Mar 13 10:52:21.706210 
master-0 kubenswrapper[17876]: I0313 10:52:21.706164 17876 scope.go:117] "RemoveContainer" containerID="098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5" Mar 13 10:52:21.706818 master-0 kubenswrapper[17876]: E0313 10:52:21.706763 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5\": container with ID starting with 098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5 not found: ID does not exist" containerID="098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5" Mar 13 10:52:21.706897 master-0 kubenswrapper[17876]: I0313 10:52:21.706814 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5"} err="failed to get container status \"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5\": rpc error: code = NotFound desc = could not find container \"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5\": container with ID starting with 098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5 not found: ID does not exist" Mar 13 10:52:21.706897 master-0 kubenswrapper[17876]: I0313 10:52:21.706830 17876 scope.go:117] "RemoveContainer" containerID="0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1" Mar 13 10:52:21.707429 master-0 kubenswrapper[17876]: E0313 10:52:21.707397 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1\": container with ID starting with 0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1 not found: ID does not exist" containerID="0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1" Mar 13 10:52:21.707486 master-0 kubenswrapper[17876]: I0313 
10:52:21.707427 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1"} err="failed to get container status \"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1\": rpc error: code = NotFound desc = could not find container \"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1\": container with ID starting with 0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1 not found: ID does not exist" Mar 13 10:52:21.707486 master-0 kubenswrapper[17876]: I0313 10:52:21.707444 17876 scope.go:117] "RemoveContainer" containerID="2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099" Mar 13 10:52:21.707790 master-0 kubenswrapper[17876]: E0313 10:52:21.707756 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099\": container with ID starting with 2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099 not found: ID does not exist" containerID="2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099" Mar 13 10:52:21.707849 master-0 kubenswrapper[17876]: I0313 10:52:21.707778 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099"} err="failed to get container status \"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099\": rpc error: code = NotFound desc = could not find container \"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099\": container with ID starting with 2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099 not found: ID does not exist" Mar 13 10:52:21.707849 master-0 kubenswrapper[17876]: I0313 10:52:21.707808 17876 scope.go:117] "RemoveContainer" 
containerID="fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9" Mar 13 10:52:21.708169 master-0 kubenswrapper[17876]: E0313 10:52:21.708139 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9\": container with ID starting with fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9 not found: ID does not exist" containerID="fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9" Mar 13 10:52:21.708233 master-0 kubenswrapper[17876]: I0313 10:52:21.708167 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9"} err="failed to get container status \"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9\": rpc error: code = NotFound desc = could not find container \"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9\": container with ID starting with fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9 not found: ID does not exist" Mar 13 10:52:21.708233 master-0 kubenswrapper[17876]: I0313 10:52:21.708183 17876 scope.go:117] "RemoveContainer" containerID="7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f" Mar 13 10:52:21.708492 master-0 kubenswrapper[17876]: E0313 10:52:21.708461 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f\": container with ID starting with 7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f not found: ID does not exist" containerID="7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f" Mar 13 10:52:21.708492 master-0 kubenswrapper[17876]: I0313 10:52:21.708484 17876 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f"} err="failed to get container status \"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f\": rpc error: code = NotFound desc = could not find container \"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f\": container with ID starting with 7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f not found: ID does not exist" Mar 13 10:52:21.708612 master-0 kubenswrapper[17876]: I0313 10:52:21.708514 17876 scope.go:117] "RemoveContainer" containerID="a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f" Mar 13 10:52:21.708954 master-0 kubenswrapper[17876]: E0313 10:52:21.708921 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f\": container with ID starting with a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f not found: ID does not exist" containerID="a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f" Mar 13 10:52:21.709011 master-0 kubenswrapper[17876]: I0313 10:52:21.708959 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f"} err="failed to get container status \"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f\": rpc error: code = NotFound desc = could not find container \"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f\": container with ID starting with a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f not found: ID does not exist" Mar 13 10:52:21.709011 master-0 kubenswrapper[17876]: I0313 10:52:21.708973 17876 scope.go:117] "RemoveContainer" containerID="80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47" Mar 13 10:52:21.709363 master-0 kubenswrapper[17876]: I0313 
10:52:21.709305 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47"} err="failed to get container status \"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47\": rpc error: code = NotFound desc = could not find container \"80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47\": container with ID starting with 80144b39b4d76bbe5ffe46bc4aa18256642d8cf4169d504aeb4ed547ab21ee47 not found: ID does not exist" Mar 13 10:52:21.709363 master-0 kubenswrapper[17876]: I0313 10:52:21.709361 17876 scope.go:117] "RemoveContainer" containerID="ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62" Mar 13 10:52:21.709697 master-0 kubenswrapper[17876]: I0313 10:52:21.709664 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62"} err="failed to get container status \"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62\": rpc error: code = NotFound desc = could not find container \"ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62\": container with ID starting with ebb0a43d717cd862ceecc14bb91fe57bd1273a785e9bbbddb322a6b58f958a62 not found: ID does not exist" Mar 13 10:52:21.709697 master-0 kubenswrapper[17876]: I0313 10:52:21.709686 17876 scope.go:117] "RemoveContainer" containerID="098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5" Mar 13 10:52:21.710022 master-0 kubenswrapper[17876]: I0313 10:52:21.709989 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5"} err="failed to get container status \"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5\": rpc error: code = NotFound desc = could not find container 
\"098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5\": container with ID starting with 098816e75f18d38c265c0f3e69f26f90aa123fbb04c0e1ca357dc6ad42c4f5a5 not found: ID does not exist" Mar 13 10:52:21.710079 master-0 kubenswrapper[17876]: I0313 10:52:21.710024 17876 scope.go:117] "RemoveContainer" containerID="0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1" Mar 13 10:52:21.710340 master-0 kubenswrapper[17876]: I0313 10:52:21.710298 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1"} err="failed to get container status \"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1\": rpc error: code = NotFound desc = could not find container \"0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1\": container with ID starting with 0fbe481fb92a3c8585c4d7e93a70a878312d5fa1b3d02918e5b89d8de94a2ec1 not found: ID does not exist" Mar 13 10:52:21.710340 master-0 kubenswrapper[17876]: I0313 10:52:21.710328 17876 scope.go:117] "RemoveContainer" containerID="2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099" Mar 13 10:52:21.710679 master-0 kubenswrapper[17876]: I0313 10:52:21.710646 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099"} err="failed to get container status \"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099\": rpc error: code = NotFound desc = could not find container \"2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099\": container with ID starting with 2335645113a0a10ca4bc6cbde4d4a8b5afa2c9c230c9977192705a25a35ae099 not found: ID does not exist" Mar 13 10:52:21.710679 master-0 kubenswrapper[17876]: I0313 10:52:21.710665 17876 scope.go:117] "RemoveContainer" containerID="fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9" Mar 13 
10:52:21.710960 master-0 kubenswrapper[17876]: I0313 10:52:21.710921 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9"} err="failed to get container status \"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9\": rpc error: code = NotFound desc = could not find container \"fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9\": container with ID starting with fd01d4f9c4063dd13fbca473a909f42f6a88fd650c1fffa8cbe3f920accc2cf9 not found: ID does not exist" Mar 13 10:52:21.710960 master-0 kubenswrapper[17876]: I0313 10:52:21.710947 17876 scope.go:117] "RemoveContainer" containerID="7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f" Mar 13 10:52:21.711295 master-0 kubenswrapper[17876]: I0313 10:52:21.711253 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f"} err="failed to get container status \"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f\": rpc error: code = NotFound desc = could not find container \"7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f\": container with ID starting with 7a3bacea71d780fde4c4a603a7e6ca1f27581814c6239f3c78187ba1a7f4ae6f not found: ID does not exist" Mar 13 10:52:21.711295 master-0 kubenswrapper[17876]: I0313 10:52:21.711278 17876 scope.go:117] "RemoveContainer" containerID="a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f" Mar 13 10:52:21.711783 master-0 kubenswrapper[17876]: I0313 10:52:21.711737 17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f"} err="failed to get container status \"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f\": rpc error: code = NotFound desc = could not find 
container \"a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f\": container with ID starting with a675d30662f5c9211953f6ecf6c93a6b8b845989324b87eedd80d838aff7899f not found: ID does not exist" Mar 13 10:52:22.505808 master-0 kubenswrapper[17876]: I0313 10:52:22.505758 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" path="/var/lib/kubelet/pods/8e52bef89f4b50e4590a1719bcc5d7e5/volumes" Mar 13 10:52:24.294500 master-0 kubenswrapper[17876]: I0313 10:52:24.294366 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" Mar 13 10:52:24.675708 master-0 kubenswrapper[17876]: E0313 10:52:24.675379 17876 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c611860f28d61 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:51:50.610750817 +0000 UTC m=+618.446557293,LastTimestamp:2026-03-13 10:51:50.610750817 +0000 UTC m=+618.446557293,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:52:24.720497 master-0 kubenswrapper[17876]: E0313 10:52:24.720387 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:26.494043 
master-0 kubenswrapper[17876]: I0313 10:52:26.493944 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 10:52:26.514279 master-0 kubenswrapper[17876]: I0313 10:52:26.514214 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008" Mar 13 10:52:26.514279 master-0 kubenswrapper[17876]: I0313 10:52:26.514266 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008" Mar 13 10:52:31.513805 master-0 kubenswrapper[17876]: E0313 10:52:31.513729 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:31.514544 master-0 kubenswrapper[17876]: I0313 10:52:31.514508 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:31.545156 master-0 kubenswrapper[17876]: W0313 10:52:31.544845 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4c95608e26ddbbd2e5890fcd9f507b5.slice/crio-902f52dc31ecf437faaa41985c95041e1816976bd44289fca44dcb9a2094aa74 WatchSource:0}: Error finding container 902f52dc31ecf437faaa41985c95041e1816976bd44289fca44dcb9a2094aa74: Status 404 returned error can't find the container with id 902f52dc31ecf437faaa41985c95041e1816976bd44289fca44dcb9a2094aa74 Mar 13 10:52:31.615121 master-0 kubenswrapper[17876]: I0313 10:52:31.615012 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"902f52dc31ecf437faaa41985c95041e1816976bd44289fca44dcb9a2094aa74"} Mar 13 10:52:32.632578 master-0 kubenswrapper[17876]: I0313 10:52:32.632523 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"74505b52ff931df71f09020188bdf8ab96f8ccda04f798cecce42b6d998c19f4"} Mar 13 10:52:32.632578 master-0 kubenswrapper[17876]: I0313 10:52:32.632580 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"9402e6bad33074f5d9313e72f6dee9925c9e58b9950fe88ca39c8cdf42870a9f"} Mar 13 10:52:32.633231 master-0 kubenswrapper[17876]: I0313 10:52:32.632592 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"bf58bacbb76fb3c8d19de9e3379132c3e60ea71f581a2962f370768e75126e79"} Mar 13 10:52:32.633231 master-0 kubenswrapper[17876]: I0313 10:52:32.632602 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"a1c6cb0ea580ee2e5a0163e45679098d78a3a050a25ce220ae2f9cbfc810bd08"} Mar 13 10:52:32.633231 master-0 kubenswrapper[17876]: I0313 10:52:32.632991 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:52:32.633231 master-0 kubenswrapper[17876]: I0313 10:52:32.633031 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:52:34.720762 master-0 kubenswrapper[17876]: E0313 10:52:34.720631 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" Mar 13 10:52:41.515021 master-0 kubenswrapper[17876]: I0313 10:52:41.514923 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:41.515021 master-0 kubenswrapper[17876]: I0313 10:52:41.515005 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:41.515021 master-0 kubenswrapper[17876]: I0313 10:52:41.515020 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:41.515021 master-0 kubenswrapper[17876]: 
I0313 10:52:41.515034 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:41.519406 master-0 kubenswrapper[17876]: I0313 10:52:41.519349 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:44.515696 master-0 kubenswrapper[17876]: I0313 10:52:44.515614 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:52:44.516203 master-0 kubenswrapper[17876]: I0313 10:52:44.515715 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:44.721550 master-0 kubenswrapper[17876]: E0313 10:52:44.721452 17876 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:44.721550 master-0 kubenswrapper[17876]: I0313 10:52:44.721545 17876 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 13 10:52:45.343308 master-0 kubenswrapper[17876]: E0313 10:52:45.343189 17876 kubelet_node_status.go:585] "Error updating node status, will retry" 
err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:52:35Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:52:35Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:52:35Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:52:35Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:46.778695 master-0 kubenswrapper[17876]: I0313 10:52:46.778639 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-hkjrg_a3c91eef-ec46-419f-b418-ac3a8094b77d/approver/1.log" Mar 13 10:52:46.780229 master-0 kubenswrapper[17876]: I0313 10:52:46.780157 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-hkjrg_a3c91eef-ec46-419f-b418-ac3a8094b77d/approver/0.log" Mar 13 10:52:46.780950 master-0 kubenswrapper[17876]: I0313 10:52:46.780881 17876 generic.go:334] "Generic (PLEG): container finished" podID="a3c91eef-ec46-419f-b418-ac3a8094b77d" containerID="4306aa93623283fa1e756de36acf9fe639a1c8b92b5741ac2b1dc315689b3cc6" exitCode=1 Mar 13 10:52:46.781018 master-0 kubenswrapper[17876]: I0313 
10:52:46.780954 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerDied","Data":"4306aa93623283fa1e756de36acf9fe639a1c8b92b5741ac2b1dc315689b3cc6"} Mar 13 10:52:46.781073 master-0 kubenswrapper[17876]: I0313 10:52:46.781016 17876 scope.go:117] "RemoveContainer" containerID="d549e33454132cb59d35aa82f54081df02e47c5f25713ca9aa9235feadd56248" Mar 13 10:52:46.782209 master-0 kubenswrapper[17876]: I0313 10:52:46.782162 17876 scope.go:117] "RemoveContainer" containerID="4306aa93623283fa1e756de36acf9fe639a1c8b92b5741ac2b1dc315689b3cc6" Mar 13 10:52:47.790733 master-0 kubenswrapper[17876]: I0313 10:52:47.790648 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-hkjrg_a3c91eef-ec46-419f-b418-ac3a8094b77d/approver/1.log" Mar 13 10:52:47.791223 master-0 kubenswrapper[17876]: I0313 10:52:47.790965 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-hkjrg" event={"ID":"a3c91eef-ec46-419f-b418-ac3a8094b77d","Type":"ContainerStarted","Data":"d45c17de205a7942431a6767c9156b3d684d44643738829dd8cb109c56c84a3d"} Mar 13 10:52:51.118909 master-0 kubenswrapper[17876]: I0313 10:52:51.118814 17876 status_manager.go:851] "Failed to get status for pod" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" pod="openshift-etcd/etcd-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods etcd-master-0)" Mar 13 10:52:51.523195 master-0 kubenswrapper[17876]: I0313 10:52:51.523081 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:52:54.515870 master-0 kubenswrapper[17876]: I0313 10:52:54.515710 17876 patch_prober.go:28] interesting 
pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:52:54.515870 master-0 kubenswrapper[17876]: I0313 10:52:54.515847 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:54.722885 master-0 kubenswrapper[17876]: E0313 10:52:54.722759 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms" Mar 13 10:52:55.344232 master-0 kubenswrapper[17876]: E0313 10:52:55.344150 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:52:58.680025 master-0 kubenswrapper[17876]: E0313 10:52:58.679763 17876 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c611860f30de6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:51:50.610783718 +0000 UTC m=+618.446590204,LastTimestamp:2026-03-13 10:51:50.610783718 +0000 UTC m=+618.446590204,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:53:00.583430 master-0 kubenswrapper[17876]: E0313 10:53:00.583377 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 10:53:00.584352 master-0 kubenswrapper[17876]: I0313 10:53:00.584327 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 13 10:53:00.614692 master-0 kubenswrapper[17876]: W0313 10:53:00.614640 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-7fd7f2eed1ada5df309ce5cf76c26f732c7b4db891cf4e18781411db5d12731a WatchSource:0}: Error finding container 7fd7f2eed1ada5df309ce5cf76c26f732c7b4db891cf4e18781411db5d12731a: Status 404 returned error can't find the container with id 7fd7f2eed1ada5df309ce5cf76c26f732c7b4db891cf4e18781411db5d12731a Mar 13 10:53:01.025399 master-0 kubenswrapper[17876]: I0313 10:53:01.025339 17876 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="6b13eae775734c33146e4799ad8bab0af75eb61762eedcadea93f03d6ff9ad66" exitCode=0 Mar 13 10:53:01.025399 master-0 kubenswrapper[17876]: I0313 10:53:01.025396 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"6b13eae775734c33146e4799ad8bab0af75eb61762eedcadea93f03d6ff9ad66"}
Mar 13 10:53:01.025703 master-0 kubenswrapper[17876]: I0313 10:53:01.025428 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"7fd7f2eed1ada5df309ce5cf76c26f732c7b4db891cf4e18781411db5d12731a"}
Mar 13 10:53:01.025703 master-0 kubenswrapper[17876]: I0313 10:53:01.025675 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008"
Mar 13 10:53:01.025703 master-0 kubenswrapper[17876]: I0313 10:53:01.025687 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008"
Mar 13 10:53:02.506398 master-0 kubenswrapper[17876]: I0313 10:53:02.506308 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:56616->127.0.0.1:10357: read: connection reset by peer" start-of-body=
Mar 13 10:53:02.507339 master-0 kubenswrapper[17876]: I0313 10:53:02.506882 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:56616->127.0.0.1:10357: read: connection reset by peer"
Mar 13 10:53:02.512029 master-0 kubenswrapper[17876]: I0313 10:53:02.511967 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:03.040459 master-0 kubenswrapper[17876]: I0313 10:53:03.040402 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/0.log"
Mar 13 10:53:03.040884 master-0 kubenswrapper[17876]: I0313 10:53:03.040839 17876 generic.go:334] "Generic (PLEG): container finished" podID="d4c95608e26ddbbd2e5890fcd9f507b5" containerID="bf58bacbb76fb3c8d19de9e3379132c3e60ea71f581a2962f370768e75126e79" exitCode=255
Mar 13 10:53:03.040974 master-0 kubenswrapper[17876]: I0313 10:53:03.040883 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerDied","Data":"bf58bacbb76fb3c8d19de9e3379132c3e60ea71f581a2962f370768e75126e79"}
Mar 13 10:53:04.924060 master-0 kubenswrapper[17876]: E0313 10:53:04.923954 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms"
Mar 13 10:53:05.345070 master-0 kubenswrapper[17876]: E0313 10:53:05.344513 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:06.636992 master-0 kubenswrapper[17876]: E0313 10:53:06.636908 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:06.637761 master-0 kubenswrapper[17876]: I0313 10:53:06.637712 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"bf58bacbb76fb3c8d19de9e3379132c3e60ea71f581a2962f370768e75126e79"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 13 10:53:06.638147 master-0 kubenswrapper[17876]: I0313 10:53:06.638017 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" containerID="cri-o://bf58bacbb76fb3c8d19de9e3379132c3e60ea71f581a2962f370768e75126e79" gracePeriod=30
Mar 13 10:53:07.350818 master-0 kubenswrapper[17876]: I0313 10:53:07.350760 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/0.log"
Mar 13 10:53:07.351133 master-0 kubenswrapper[17876]: I0313 10:53:07.351070 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"a0a22034c3dde350367ecabb82d0252125380af67f03b055a1e177ceaee53cd5"}
Mar 13 10:53:07.351450 master-0 kubenswrapper[17876]: I0313 10:53:07.351406 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57"
Mar 13 10:53:07.351450 master-0 kubenswrapper[17876]: I0313 10:53:07.351432 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57"
Mar 13 10:53:11.515206 master-0 kubenswrapper[17876]: I0313 10:53:11.514999 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:11.515206 master-0 kubenswrapper[17876]: I0313 10:53:11.515141 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:14.515925 master-0 kubenswrapper[17876]: I0313 10:53:14.515833 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:53:14.516602 master-0 kubenswrapper[17876]: I0313 10:53:14.515941 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:15.325168 master-0 kubenswrapper[17876]: E0313 10:53:15.325024 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms"
Mar 13 10:53:15.345648 master-0 kubenswrapper[17876]: E0313 10:53:15.345525 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:24.515810 master-0 kubenswrapper[17876]: I0313 10:53:24.515670 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:53:24.515810 master-0 kubenswrapper[17876]: I0313 10:53:24.515784 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:25.346074 master-0 kubenswrapper[17876]: E0313 10:53:25.345979 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:25.346074 master-0 kubenswrapper[17876]: E0313 10:53:25.346030 17876 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count"
Mar 13 10:53:26.126720 master-0 kubenswrapper[17876]: E0313 10:53:26.126611 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s"
Mar 13 10:53:32.684049 master-0 kubenswrapper[17876]: E0313 10:53:32.683757 17876 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189c611860f3934d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:51:50.610817869 +0000 UTC m=+618.446624345,LastTimestamp:2026-03-13 10:51:50.610817869 +0000 UTC m=+618.446624345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:53:34.516248 master-0 kubenswrapper[17876]: I0313 10:53:34.516183 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:53:34.516768 master-0 kubenswrapper[17876]: I0313 10:53:34.516246 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:34.516768 master-0 kubenswrapper[17876]: I0313 10:53:34.516313 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:35.029563 master-0 kubenswrapper[17876]: E0313 10:53:35.029423 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 13 10:53:35.767497 master-0 kubenswrapper[17876]: I0313 10:53:35.767407 17876 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="ef42acb843b6f1acbbe14dfd171b13483b08b189047bc2a0a4ec2f8f82e1f9da" exitCode=0
Mar 13 10:53:35.767497 master-0 kubenswrapper[17876]: I0313 10:53:35.767493 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"ef42acb843b6f1acbbe14dfd171b13483b08b189047bc2a0a4ec2f8f82e1f9da"}
Mar 13 10:53:35.768515 master-0 kubenswrapper[17876]: I0313 10:53:35.767972 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008"
Mar 13 10:53:35.768515 master-0 kubenswrapper[17876]: I0313 10:53:35.768009 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008"
Mar 13 10:53:37.729055 master-0 kubenswrapper[17876]: E0313 10:53:37.728557 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s"
Mar 13 10:53:37.785510 master-0 kubenswrapper[17876]: I0313 10:53:37.785471 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/1.log"
Mar 13 10:53:37.787109 master-0 kubenswrapper[17876]: I0313 10:53:37.787046 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/0.log"
Mar 13 10:53:37.787581 master-0 kubenswrapper[17876]: I0313 10:53:37.787549 17876 generic.go:334] "Generic (PLEG): container finished" podID="d4c95608e26ddbbd2e5890fcd9f507b5" containerID="a0a22034c3dde350367ecabb82d0252125380af67f03b055a1e177ceaee53cd5" exitCode=255
Mar 13 10:53:37.787706 master-0 kubenswrapper[17876]: I0313 10:53:37.787602 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerDied","Data":"a0a22034c3dde350367ecabb82d0252125380af67f03b055a1e177ceaee53cd5"}
Mar 13 10:53:37.787841 master-0 kubenswrapper[17876]: I0313 10:53:37.787825 17876 scope.go:117] "RemoveContainer" containerID="bf58bacbb76fb3c8d19de9e3379132c3e60ea71f581a2962f370768e75126e79"
Mar 13 10:53:38.800152 master-0 kubenswrapper[17876]: I0313 10:53:38.799942 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-657wt_db9faadf-74e9-4a7f-b3a6-902dd14ac978/manager/1.log"
Mar 13 10:53:38.805391 master-0 kubenswrapper[17876]: I0313 10:53:38.805318 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-657wt_db9faadf-74e9-4a7f-b3a6-902dd14ac978/manager/0.log"
Mar 13 10:53:38.806144 master-0 kubenswrapper[17876]: I0313 10:53:38.806064 17876 generic.go:334] "Generic (PLEG): container finished" podID="db9faadf-74e9-4a7f-b3a6-902dd14ac978" containerID="c14288f5668e235056cc67c66c8553579053cff3b8159a0ec2c339bf75712609" exitCode=1
Mar 13 10:53:38.806244 master-0 kubenswrapper[17876]: I0313 10:53:38.806175 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerDied","Data":"c14288f5668e235056cc67c66c8553579053cff3b8159a0ec2c339bf75712609"}
Mar 13 10:53:38.806244 master-0 kubenswrapper[17876]: I0313 10:53:38.806239 17876 scope.go:117] "RemoveContainer" containerID="84ed6fae08bf4a492c0a06628d17fed3556bf3cf0fb6950b3ee1afcbd54dfc1c"
Mar 13 10:53:38.806861 master-0 kubenswrapper[17876]: I0313 10:53:38.806822 17876 scope.go:117] "RemoveContainer" containerID="c14288f5668e235056cc67c66c8553579053cff3b8159a0ec2c339bf75712609"
Mar 13 10:53:38.811979 master-0 kubenswrapper[17876]: I0313 10:53:38.811819 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/1.log"
Mar 13 10:53:39.825566 master-0 kubenswrapper[17876]: I0313 10:53:39.825504 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-657wt_db9faadf-74e9-4a7f-b3a6-902dd14ac978/manager/1.log"
Mar 13 10:53:39.826518 master-0 kubenswrapper[17876]: I0313 10:53:39.826470 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt" event={"ID":"db9faadf-74e9-4a7f-b3a6-902dd14ac978","Type":"ContainerStarted","Data":"938274705a3cc52e0fbf1190ffb26ef56e4d299f54a271b378a8bc683898aec4"}
Mar 13 10:53:39.827390 master-0 kubenswrapper[17876]: I0313 10:53:39.827314 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:53:40.840460 master-0 kubenswrapper[17876]: I0313 10:53:40.840382 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/1.log"
Mar 13 10:53:40.842050 master-0 kubenswrapper[17876]: I0313 10:53:40.841984 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/0.log"
Mar 13 10:53:40.842180 master-0 kubenswrapper[17876]: I0313 10:53:40.842082 17876 generic.go:334] "Generic (PLEG): container finished" podID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" containerID="1a464f39fb3c28eac1b441005b20c015c22b43034a9004d421103f0a297535d2" exitCode=1
Mar 13 10:53:40.842254 master-0 kubenswrapper[17876]: I0313 10:53:40.842181 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerDied","Data":"1a464f39fb3c28eac1b441005b20c015c22b43034a9004d421103f0a297535d2"}
Mar 13 10:53:40.842323 master-0 kubenswrapper[17876]: I0313 10:53:40.842279 17876 scope.go:117] "RemoveContainer" containerID="65303d479992d7eac3c67c36b8aaff361e114ce77094761d7640db6355190c8e"
Mar 13 10:53:40.843196 master-0 kubenswrapper[17876]: I0313 10:53:40.843133 17876 scope.go:117] "RemoveContainer" containerID="1a464f39fb3c28eac1b441005b20c015c22b43034a9004d421103f0a297535d2"
Mar 13 10:53:41.353852 master-0 kubenswrapper[17876]: E0313 10:53:41.353766 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:41.354448 master-0 kubenswrapper[17876]: I0313 10:53:41.354401 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"a0a22034c3dde350367ecabb82d0252125380af67f03b055a1e177ceaee53cd5"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 13 10:53:41.354622 master-0 kubenswrapper[17876]: I0313 10:53:41.354583 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" containerID="cri-o://a0a22034c3dde350367ecabb82d0252125380af67f03b055a1e177ceaee53cd5" gracePeriod=30
Mar 13 10:53:41.853608 master-0 kubenswrapper[17876]: I0313 10:53:41.853457 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/1.log"
Mar 13 10:53:41.854188 master-0 kubenswrapper[17876]: I0313 10:53:41.853754 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"b75fbf8a5624be5a03d5c44ed09ccde072e0ccf0feb8c353317c2e50de8d72cb"}
Mar 13 10:53:41.855823 master-0 kubenswrapper[17876]: I0313 10:53:41.855789 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/1.log"
Mar 13 10:53:41.857125 master-0 kubenswrapper[17876]: I0313 10:53:41.857007 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"f5ead5a0b7301d0c828a45a6e8f562bb6ae051522693311fc222d83af77b779a"}
Mar 13 10:53:41.857525 master-0 kubenswrapper[17876]: I0313 10:53:41.857454 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57"
Mar 13 10:53:41.857580 master-0 kubenswrapper[17876]: I0313 10:53:41.857529 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57"
Mar 13 10:53:42.876658 master-0 kubenswrapper[17876]: I0313 10:53:42.876594 17876 generic.go:334] "Generic (PLEG): container finished" podID="1ef32245-c238-43c6-a57a-a5ac95aff1f7" containerID="7f44cac9d59c9752582d0c710ae74baa24a3adcc9cd398ea6e5fd9c8a59527e5" exitCode=0
Mar 13 10:53:42.876658 master-0 kubenswrapper[17876]: I0313 10:53:42.876658 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerDied","Data":"7f44cac9d59c9752582d0c710ae74baa24a3adcc9cd398ea6e5fd9c8a59527e5"}
Mar 13 10:53:42.877418 master-0 kubenswrapper[17876]: I0313 10:53:42.876750 17876 scope.go:117] "RemoveContainer" containerID="a91f7cc014bcb325926843367389352ca03fb235615d46451a4baa8a7058522f"
Mar 13 10:53:42.877484 master-0 kubenswrapper[17876]: I0313 10:53:42.877464 17876 scope.go:117] "RemoveContainer" containerID="7f44cac9d59c9752582d0c710ae74baa24a3adcc9cd398ea6e5fd9c8a59527e5"
Mar 13 10:53:42.880921 master-0 kubenswrapper[17876]: I0313 10:53:42.880857 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n_b7090328-1191-4c7c-afed-603d7333014f/config-sync-controllers/0.log"
Mar 13 10:53:42.881676 master-0 kubenswrapper[17876]: I0313 10:53:42.881597 17876 generic.go:334] "Generic (PLEG): container finished" podID="b7090328-1191-4c7c-afed-603d7333014f" containerID="b449d051473ff9974acc080b10607f0bdeb8e4b0dbbbfc4c1bde4f8d09a30cfb" exitCode=1
Mar 13 10:53:42.881788 master-0 kubenswrapper[17876]: I0313 10:53:42.881699 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerDied","Data":"b449d051473ff9974acc080b10607f0bdeb8e4b0dbbbfc4c1bde4f8d09a30cfb"}
Mar 13 10:53:42.882522 master-0 kubenswrapper[17876]: I0313 10:53:42.882478 17876 scope.go:117] "RemoveContainer" containerID="b449d051473ff9974acc080b10607f0bdeb8e4b0dbbbfc4c1bde4f8d09a30cfb"
Mar 13 10:53:43.897377 master-0 kubenswrapper[17876]: I0313 10:53:43.897285 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n" event={"ID":"1ef32245-c238-43c6-a57a-a5ac95aff1f7","Type":"ContainerStarted","Data":"a6443d558168eb29c7900515927e3078011e8c18c94c8460be114015e33cf9ce"}
Mar 13 10:53:43.898477 master-0 kubenswrapper[17876]: I0313 10:53:43.897752 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:53:43.901642 master-0 kubenswrapper[17876]: I0313 10:53:43.901422 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-4v99n"
Mar 13 10:53:43.902180 master-0 kubenswrapper[17876]: I0313 10:53:43.902055 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n_b7090328-1191-4c7c-afed-603d7333014f/config-sync-controllers/0.log"
Mar 13 10:53:43.903172 master-0 kubenswrapper[17876]: I0313 10:53:43.902839 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"123f0aa611a604ea12ca4f9130258ef8e36b35c7f19586dd48c9dabbd2524804"}
Mar 13 10:53:48.007140 master-0 kubenswrapper[17876]: I0313 10:53:48.007043 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-657wt"
Mar 13 10:53:50.941707 master-0 kubenswrapper[17876]: E0313 10:53:50.941500 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s"
Mar 13 10:53:51.120849 master-0 kubenswrapper[17876]: I0313 10:53:51.120717 17876 status_manager.go:851] "Failed to get status for pod" podUID="de9eb09a-0b9b-4190-b3ce-7eb971c93fae" pod="openshift-etcd/installer-2-retry-2-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-2-retry-2-master-0)"
Mar 13 10:53:51.515073 master-0 kubenswrapper[17876]: I0313 10:53:51.514989 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:51.515418 master-0 kubenswrapper[17876]: I0313 10:53:51.515168 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:53:54.515990 master-0 kubenswrapper[17876]: I0313 10:53:54.515823 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:53:54.515990 master-0 kubenswrapper[17876]: I0313 10:53:54.515952 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:53:59.155691 master-0 kubenswrapper[17876]: I0313 10:53:59.155622 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n_b7090328-1191-4c7c-afed-603d7333014f/config-sync-controllers/0.log"
Mar 13 10:53:59.156321 master-0 kubenswrapper[17876]: I0313 10:53:59.156245 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n_b7090328-1191-4c7c-afed-603d7333014f/cluster-cloud-controller-manager/0.log"
Mar 13 10:53:59.156321 master-0 kubenswrapper[17876]: I0313 10:53:59.156299 17876 generic.go:334] "Generic (PLEG): container finished" podID="b7090328-1191-4c7c-afed-603d7333014f" containerID="6b457fca38abf31ca20d44610b680f150e7060cd35d43f544ed341cc62e726d2" exitCode=1
Mar 13 10:53:59.156418 master-0 kubenswrapper[17876]: I0313 10:53:59.156350 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerDied","Data":"6b457fca38abf31ca20d44610b680f150e7060cd35d43f544ed341cc62e726d2"}
Mar 13 10:53:59.156959 master-0 kubenswrapper[17876]: I0313 10:53:59.156928 17876 scope.go:117] "RemoveContainer" containerID="6b457fca38abf31ca20d44610b680f150e7060cd35d43f544ed341cc62e726d2"
Mar 13 10:54:00.173350 master-0 kubenswrapper[17876]: I0313 10:54:00.173217 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n_b7090328-1191-4c7c-afed-603d7333014f/config-sync-controllers/0.log"
Mar 13 10:54:00.174837 master-0 kubenswrapper[17876]: I0313 10:54:00.174171 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n_b7090328-1191-4c7c-afed-603d7333014f/cluster-cloud-controller-manager/0.log"
Mar 13 10:54:00.174837 master-0 kubenswrapper[17876]: I0313 10:54:00.174279 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-w4d4n" event={"ID":"b7090328-1191-4c7c-afed-603d7333014f","Type":"ContainerStarted","Data":"a3720f37c2836030e89da2a923d5894aabd5c585158aa33ed036f29d5ece092e"}
Mar 13 10:54:04.273533 master-0 kubenswrapper[17876]: I0313 10:54:04.273456 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-22jb5_ec33c506-8abe-4659-84d3-a294c31b446c/manager/1.log"
Mar 13 10:54:04.279580 master-0 kubenswrapper[17876]: I0313 10:54:04.279516 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-22jb5_ec33c506-8abe-4659-84d3-a294c31b446c/manager/0.log"
Mar 13 10:54:04.279769 master-0 kubenswrapper[17876]: I0313 10:54:04.279621 17876 generic.go:334] "Generic (PLEG): container finished" podID="ec33c506-8abe-4659-84d3-a294c31b446c" containerID="b6607de7f8444878291cce041e89b284e3fdfa07de1c40770b98ee1612cc8d65" exitCode=1
Mar 13 10:54:04.279769 master-0 kubenswrapper[17876]: I0313 10:54:04.279695 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerDied","Data":"b6607de7f8444878291cce041e89b284e3fdfa07de1c40770b98ee1612cc8d65"}
Mar 13 10:54:04.279918 master-0 kubenswrapper[17876]: I0313 10:54:04.279780 17876 scope.go:117] "RemoveContainer" containerID="eef8df0e8104fd7c100ce9287ca728a8ffd7fa03eb81ac77feb69da88983a946"
Mar 13 10:54:04.281410 master-0 kubenswrapper[17876]: I0313 10:54:04.281339 17876 scope.go:117] "RemoveContainer" containerID="b6607de7f8444878291cce041e89b284e3fdfa07de1c40770b98ee1612cc8d65"
Mar 13 10:54:04.515486 master-0 kubenswrapper[17876]: I0313 10:54:04.515381 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:54:04.515646 master-0 kubenswrapper[17876]: I0313 10:54:04.515505 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:54:05.290839 master-0 kubenswrapper[17876]: I0313 10:54:05.290748 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-22jb5_ec33c506-8abe-4659-84d3-a294c31b446c/manager/1.log"
Mar 13 10:54:05.291875 master-0 kubenswrapper[17876]: I0313 10:54:05.291280 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" event={"ID":"ec33c506-8abe-4659-84d3-a294c31b446c","Type":"ContainerStarted","Data":"063c6c1a6a3827b6e41b9b977d0be19a1cd949db6b91ad35d0b36cb5a9690367"}
Mar 13 10:54:05.291875 master-0 kubenswrapper[17876]: I0313 10:54:05.291753 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5"
Mar 13 10:54:06.688153 master-0 kubenswrapper[17876]: E0313 10:54:06.687778 17876 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{metallb-operator-controller-manager-57755f98f6-7pnfb.189c61186786866a metallb-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:metallb-system,Name:metallb-operator-controller-manager-57755f98f6-7pnfb,UID:eb773726-9950-4399-871b-815d20abe38c,APIVersion:v1,ResourceVersion:17961,FieldPath:spec.containers{manager},},Reason:Created,Message:Created container: manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:51:50.721111658 +0000 UTC m=+618.556918134,LastTimestamp:2026-03-13 10:51:50.721111658 +0000 UTC m=+618.556918134,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 13 10:54:07.344435 master-0 kubenswrapper[17876]: E0313 10:54:07.344313 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 13 10:54:09.771749 master-0 kubenswrapper[17876]: E0313 10:54:09.771654 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 13 10:54:10.645695 master-0 kubenswrapper[17876]: I0313 10:54:10.645636 17876 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="f37ce15bd68c50f1cdfbe350a200b600ab0e8c08fff0cc03a95cc5396b8fc09e" exitCode=0
Mar 13 10:54:10.646068 master-0 kubenswrapper[17876]: I0313 10:54:10.645698 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"f37ce15bd68c50f1cdfbe350a200b600ab0e8c08fff0cc03a95cc5396b8fc09e"}
Mar 13 10:54:10.646776 master-0 kubenswrapper[17876]: I0313 10:54:10.646739 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008"
Mar 13 10:54:10.647017 master-0 kubenswrapper[17876]: I0313 10:54:10.646988 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008"
Mar 13 10:54:11.656402 master-0 kubenswrapper[17876]: I0313 10:54:11.656345 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/2.log"
Mar 13 10:54:11.656973 master-0 kubenswrapper[17876]: I0313 10:54:11.656943 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/1.log"
Mar 13 10:54:11.657018 master-0 kubenswrapper[17876]: I0313 10:54:11.656998 17876 generic.go:334] "Generic (PLEG): container finished" podID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" containerID="b75fbf8a5624be5a03d5c44ed09ccde072e0ccf0feb8c353317c2e50de8d72cb" exitCode=1
Mar 13 10:54:11.657051 master-0 kubenswrapper[17876]: I0313 10:54:11.657030 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerDied","Data":"b75fbf8a5624be5a03d5c44ed09ccde072e0ccf0feb8c353317c2e50de8d72cb"}
Mar 13 10:54:11.657089 master-0 kubenswrapper[17876]: I0313 10:54:11.657067 17876 scope.go:117] "RemoveContainer" containerID="1a464f39fb3c28eac1b441005b20c015c22b43034a9004d421103f0a297535d2"
Mar 13 10:54:11.658428 master-0 kubenswrapper[17876]: I0313 10:54:11.658391 17876 scope.go:117] "RemoveContainer" containerID="b75fbf8a5624be5a03d5c44ed09ccde072e0ccf0feb8c353317c2e50de8d72cb"
Mar 13 10:54:11.659027 master-0 kubenswrapper[17876]: E0313 10:54:11.658877 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee"
Mar 13 10:54:12.188842 master-0 kubenswrapper[17876]: I0313 10:54:12.187967 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:53798->127.0.0.1:10357: read: connection reset by peer" start-of-body=
Mar 13 10:54:12.188842 master-0 kubenswrapper[17876]: I0313 10:54:12.188056 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:53798->127.0.0.1:10357: read: connection reset by peer"
Mar 13 10:54:12.188842 master-0 kubenswrapper[17876]: I0313 10:54:12.188186 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:54:12.668279 master-0 kubenswrapper[17876]: I0313 10:54:12.668214 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/2.log"
Mar 13 10:54:12.671074 master-0 kubenswrapper[17876]: I0313 10:54:12.670975 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/2.log"
Mar 13 10:54:12.671732 master-0 kubenswrapper[17876]: I0313 10:54:12.671648 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/1.log"
Mar 13 10:54:12.673178 master-0 kubenswrapper[17876]: I0313 10:54:12.672928 17876 generic.go:334] "Generic (PLEG): container finished" podID="d4c95608e26ddbbd2e5890fcd9f507b5" containerID="f5ead5a0b7301d0c828a45a6e8f562bb6ae051522693311fc222d83af77b779a" exitCode=255
Mar 13 10:54:12.673178 master-0 kubenswrapper[17876]: I0313 10:54:12.673008 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerDied","Data":"f5ead5a0b7301d0c828a45a6e8f562bb6ae051522693311fc222d83af77b779a"}
Mar 13 10:54:12.673178 master-0 kubenswrapper[17876]: I0313 10:54:12.673049 17876 scope.go:117] "RemoveContainer" containerID="a0a22034c3dde350367ecabb82d0252125380af67f03b055a1e177ceaee53cd5"
Mar 13 10:54:12.678863 master-0 kubenswrapper[17876]: I0313 10:54:12.678797 17876 log.go:25] "Finished parsing log file"
path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 13 10:54:12.679761 master-0 kubenswrapper[17876]: I0313 10:54:12.679685 17876 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="9f53cc3cddb8fe9d1088b7766a1921dd54985febb851e44b5536925b781b058e" exitCode=1 Mar 13 10:54:12.679981 master-0 kubenswrapper[17876]: I0313 10:54:12.679767 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"9f53cc3cddb8fe9d1088b7766a1921dd54985febb851e44b5536925b781b058e"} Mar 13 10:54:12.680992 master-0 kubenswrapper[17876]: I0313 10:54:12.680915 17876 scope.go:117] "RemoveContainer" containerID="9f53cc3cddb8fe9d1088b7766a1921dd54985febb851e44b5536925b781b058e" Mar 13 10:54:13.693405 master-0 kubenswrapper[17876]: I0313 10:54:13.693332 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 13 10:54:13.694252 master-0 kubenswrapper[17876]: I0313 10:54:13.693855 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"c53b1440b383762cfabf937882516ed92d78bde19337990cdc6393c7a9e23492"} Mar 13 10:54:13.694411 master-0 kubenswrapper[17876]: I0313 10:54:13.694297 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:54:13.695832 master-0 kubenswrapper[17876]: I0313 10:54:13.695791 17876 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/2.log" Mar 13 10:54:13.698867 master-0 kubenswrapper[17876]: I0313 10:54:13.698825 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-svqcp_a0917212-59d8-4799-a9bc-52e358c5e8a0/machine-api-operator/0.log" Mar 13 10:54:13.699252 master-0 kubenswrapper[17876]: I0313 10:54:13.699216 17876 generic.go:334] "Generic (PLEG): container finished" podID="a0917212-59d8-4799-a9bc-52e358c5e8a0" containerID="ad75c939343bfb30bc5319b14b8035776ee4b1b3343e77f1374907643eae75c7" exitCode=255 Mar 13 10:54:13.699338 master-0 kubenswrapper[17876]: I0313 10:54:13.699252 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerDied","Data":"ad75c939343bfb30bc5319b14b8035776ee4b1b3343e77f1374907643eae75c7"} Mar 13 10:54:13.699710 master-0 kubenswrapper[17876]: I0313 10:54:13.699681 17876 scope.go:117] "RemoveContainer" containerID="ad75c939343bfb30bc5319b14b8035776ee4b1b3343e77f1374907643eae75c7" Mar 13 10:54:14.713460 master-0 kubenswrapper[17876]: I0313 10:54:14.713379 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-svqcp_a0917212-59d8-4799-a9bc-52e358c5e8a0/machine-api-operator/0.log" Mar 13 10:54:14.714341 master-0 kubenswrapper[17876]: I0313 10:54:14.714296 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-svqcp" event={"ID":"a0917212-59d8-4799-a9bc-52e358c5e8a0","Type":"ContainerStarted","Data":"a14010cebdbd50ff2388c22869254fcad2f743035e73a8d850f10ea69d6dbbbd"} Mar 13 10:54:15.725653 master-0 kubenswrapper[17876]: I0313 10:54:15.725555 17876 generic.go:334] "Generic (PLEG): container finished" 
podID="193b3b95-f9a3-4272-853b-86366ce348a2" containerID="b87c048ad8f6b66600aef035430a3c74694d425a7990645314c96636905e37f6" exitCode=0 Mar 13 10:54:15.725653 master-0 kubenswrapper[17876]: I0313 10:54:15.725633 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerDied","Data":"b87c048ad8f6b66600aef035430a3c74694d425a7990645314c96636905e37f6"} Mar 13 10:54:15.726636 master-0 kubenswrapper[17876]: I0313 10:54:15.725688 17876 scope.go:117] "RemoveContainer" containerID="ebe9d6845712ab71dcaca65a6bc117d393841747dc9f910db7e844f9d2c310ac" Mar 13 10:54:15.726636 master-0 kubenswrapper[17876]: I0313 10:54:15.726614 17876 scope.go:117] "RemoveContainer" containerID="b87c048ad8f6b66600aef035430a3c74694d425a7990645314c96636905e37f6" Mar 13 10:54:15.858905 master-0 kubenswrapper[17876]: E0313 10:54:15.858825 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:54:05Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:54:05Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:54:05Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-13T10:54:05Z\\\",\\\"type\\\":\\\"Ready\\\"}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:54:15.860207 master-0 kubenswrapper[17876]: E0313 10:54:15.860070 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within 
requested timeout - context deadline exceeded" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:15.860532 master-0 kubenswrapper[17876]: I0313 10:54:15.860471 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"f5ead5a0b7301d0c828a45a6e8f562bb6ae051522693311fc222d83af77b779a"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 10:54:15.860676 master-0 kubenswrapper[17876]: I0313 10:54:15.860593 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" containerID="cri-o://f5ead5a0b7301d0c828a45a6e8f562bb6ae051522693311fc222d83af77b779a" gracePeriod=30 Mar 13 10:54:16.737281 master-0 kubenswrapper[17876]: I0313 10:54:16.737191 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-ns7z7" event={"ID":"193b3b95-f9a3-4272-853b-86366ce348a2","Type":"ContainerStarted","Data":"10b1de6a68629f30fe9888d3f63f4fd9cc44d7ace9343ebe043fb3efaea9c6bb"} Mar 13 10:54:16.740042 master-0 kubenswrapper[17876]: I0313 10:54:16.739985 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/2.log" Mar 13 10:54:16.741301 master-0 kubenswrapper[17876]: I0313 10:54:16.741262 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"cbedeb377940e4c1f57020e4918ca1d5cca1e728e2596ddaaf6bd86b6efb5a59"} Mar 13 10:54:16.741695 
master-0 kubenswrapper[17876]: I0313 10:54:16.741664 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:54:16.741764 master-0 kubenswrapper[17876]: I0313 10:54:16.741700 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:54:17.112786 master-0 kubenswrapper[17876]: I0313 10:54:17.112622 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-22jb5" Mar 13 10:54:19.773045 master-0 kubenswrapper[17876]: I0313 10:54:19.772978 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-942bv_8dc7af5f-ff72-4f06-88df-a26ff4c0bded/machine-approver-controller/0.log" Mar 13 10:54:19.774220 master-0 kubenswrapper[17876]: I0313 10:54:19.773813 17876 generic.go:334] "Generic (PLEG): container finished" podID="8dc7af5f-ff72-4f06-88df-a26ff4c0bded" containerID="f9193ce0cecc29a04837d4cc5243527b46397232b9255d51f28db25efcba2a5f" exitCode=255 Mar 13 10:54:19.774220 master-0 kubenswrapper[17876]: I0313 10:54:19.773861 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerDied","Data":"f9193ce0cecc29a04837d4cc5243527b46397232b9255d51f28db25efcba2a5f"} Mar 13 10:54:19.775141 master-0 kubenswrapper[17876]: I0313 10:54:19.775032 17876 scope.go:117] "RemoveContainer" containerID="f9193ce0cecc29a04837d4cc5243527b46397232b9255d51f28db25efcba2a5f" Mar 13 10:54:20.788130 master-0 kubenswrapper[17876]: I0313 10:54:20.788019 17876 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-942bv_8dc7af5f-ff72-4f06-88df-a26ff4c0bded/machine-approver-controller/0.log" Mar 13 10:54:20.789312 master-0 kubenswrapper[17876]: I0313 10:54:20.788894 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-942bv" event={"ID":"8dc7af5f-ff72-4f06-88df-a26ff4c0bded","Type":"ContainerStarted","Data":"1be4edf04ec1c1618f28a1b357978a11fdd7c73ae13902a785b615ae094e0ed7"} Mar 13 10:54:21.515261 master-0 kubenswrapper[17876]: I0313 10:54:21.515179 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:21.515261 master-0 kubenswrapper[17876]: I0313 10:54:21.515252 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:21.800434 master-0 kubenswrapper[17876]: I0313 10:54:21.800233 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-d5flg_06ecac2e-bffa-474b-a824-9ba4a194159a/control-plane-machine-set-operator/1.log" Mar 13 10:54:21.803123 master-0 kubenswrapper[17876]: I0313 10:54:21.803055 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-d5flg_06ecac2e-bffa-474b-a824-9ba4a194159a/control-plane-machine-set-operator/0.log" Mar 13 10:54:21.803222 master-0 kubenswrapper[17876]: I0313 10:54:21.803163 17876 generic.go:334] "Generic (PLEG): container finished" podID="06ecac2e-bffa-474b-a824-9ba4a194159a" containerID="406d6e11697cacd57dcd99d84785c736a52ac48c6ef5c27b81e728ae6e2f38f1" exitCode=1 Mar 13 10:54:21.803304 master-0 kubenswrapper[17876]: I0313 10:54:21.803267 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerDied","Data":"406d6e11697cacd57dcd99d84785c736a52ac48c6ef5c27b81e728ae6e2f38f1"} Mar 13 10:54:21.803361 master-0 kubenswrapper[17876]: I0313 10:54:21.803327 17876 scope.go:117] "RemoveContainer" containerID="77388cc43c658d8351ae6e1b9588c860c29201d049f835cd9a818f43573bd490" Mar 13 10:54:21.804400 master-0 kubenswrapper[17876]: I0313 10:54:21.804305 17876 scope.go:117] "RemoveContainer" containerID="406d6e11697cacd57dcd99d84785c736a52ac48c6ef5c27b81e728ae6e2f38f1" Mar 13 10:54:21.808311 master-0 kubenswrapper[17876]: I0313 10:54:21.808198 17876 generic.go:334] "Generic (PLEG): container finished" podID="1b7e4f08-d451-4e67-8472-4de6270ee72c" containerID="6932138c1740c44c27e9e781bf8fa7fcf05f0501eb723d550fb039d9e4b714bb" exitCode=0 Mar 13 10:54:21.808311 master-0 kubenswrapper[17876]: I0313 10:54:21.808322 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" event={"ID":"1b7e4f08-d451-4e67-8472-4de6270ee72c","Type":"ContainerDied","Data":"6932138c1740c44c27e9e781bf8fa7fcf05f0501eb723d550fb039d9e4b714bb"} Mar 13 10:54:21.809459 master-0 kubenswrapper[17876]: I0313 10:54:21.809347 17876 scope.go:117] "RemoveContainer" containerID="6932138c1740c44c27e9e781bf8fa7fcf05f0501eb723d550fb039d9e4b714bb" Mar 13 10:54:21.815263 master-0 kubenswrapper[17876]: I0313 10:54:21.815191 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/1.log" Mar 13 10:54:21.818333 master-0 kubenswrapper[17876]: I0313 10:54:21.818255 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/0.log" Mar 13 
10:54:21.818423 master-0 kubenswrapper[17876]: I0313 10:54:21.818373 17876 generic.go:334] "Generic (PLEG): container finished" podID="0881de70-2db3-4fc2-b976-b55c11dc239d" containerID="df8dbee9c77b0ca318382f012c8a23d7d342a4f43e0448369274b4a7e9be8d82" exitCode=1 Mar 13 10:54:21.818502 master-0 kubenswrapper[17876]: I0313 10:54:21.818466 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerDied","Data":"df8dbee9c77b0ca318382f012c8a23d7d342a4f43e0448369274b4a7e9be8d82"} Mar 13 10:54:21.820078 master-0 kubenswrapper[17876]: I0313 10:54:21.820032 17876 scope.go:117] "RemoveContainer" containerID="df8dbee9c77b0ca318382f012c8a23d7d342a4f43e0448369274b4a7e9be8d82" Mar 13 10:54:21.858267 master-0 kubenswrapper[17876]: I0313 10:54:21.858186 17876 scope.go:117] "RemoveContainer" containerID="d8bc48fd76e9fa9cf8445927e8d3f1d04ebf5cde90355e9a3c408980f39f3829" Mar 13 10:54:22.598366 master-0 kubenswrapper[17876]: I0313 10:54:22.598317 17876 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" Mar 13 10:54:22.598366 master-0 kubenswrapper[17876]: I0313 10:54:22.598364 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" Mar 13 10:54:22.826594 master-0 kubenswrapper[17876]: I0313 10:54:22.826536 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-d5flg_06ecac2e-bffa-474b-a824-9ba4a194159a/control-plane-machine-set-operator/1.log" Mar 13 10:54:22.827386 master-0 kubenswrapper[17876]: I0313 10:54:22.826651 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-d5flg" 
event={"ID":"06ecac2e-bffa-474b-a824-9ba4a194159a","Type":"ContainerStarted","Data":"3db60c8c343d538c9cb92d5bf6d1587bad44b37a9baaf34e668dcf9dffdb3381"} Mar 13 10:54:22.828564 master-0 kubenswrapper[17876]: I0313 10:54:22.828519 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" event={"ID":"1b7e4f08-d451-4e67-8472-4de6270ee72c","Type":"ContainerStarted","Data":"45dad7ae87d6af63a37621db5cdc2d06eae137531c0f46a1652e8bd3757c1ebb"} Mar 13 10:54:22.828814 master-0 kubenswrapper[17876]: I0313 10:54:22.828774 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" Mar 13 10:54:22.830323 master-0 kubenswrapper[17876]: I0313 10:54:22.830290 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/1.log" Mar 13 10:54:22.830613 master-0 kubenswrapper[17876]: I0313 10:54:22.830574 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"a2a3a6dbc09acd3b5c004ca8d99a0995e19ad749f409ecb1761d51cda96db999"} Mar 13 10:54:22.835115 master-0 kubenswrapper[17876]: I0313 10:54:22.835055 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5dcc5796bd-kgx74" Mar 13 10:54:24.345607 master-0 kubenswrapper[17876]: E0313 10:54:24.345446 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="7s" Mar 13 10:54:24.494764 master-0 kubenswrapper[17876]: I0313 10:54:24.494670 17876 scope.go:117] 
"RemoveContainer" containerID="b75fbf8a5624be5a03d5c44ed09ccde072e0ccf0feb8c353317c2e50de8d72cb" Mar 13 10:54:24.516323 master-0 kubenswrapper[17876]: I0313 10:54:24.516231 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:54:24.516620 master-0 kubenswrapper[17876]: I0313 10:54:24.516338 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:54:24.851304 master-0 kubenswrapper[17876]: I0313 10:54:24.851244 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/2.log" Mar 13 10:54:24.851594 master-0 kubenswrapper[17876]: I0313 10:54:24.851554 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7"} Mar 13 10:54:25.860017 master-0 kubenswrapper[17876]: E0313 10:54:25.859855 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 
10:54:29.900795 master-0 kubenswrapper[17876]: I0313 10:54:29.900735 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-p7qlt_e4b55ebf-cab8-4985-95cc-b28bc5ae0578/cluster-autoscaler-operator/0.log" Mar 13 10:54:29.902937 master-0 kubenswrapper[17876]: I0313 10:54:29.902887 17876 generic.go:334] "Generic (PLEG): container finished" podID="e4b55ebf-cab8-4985-95cc-b28bc5ae0578" containerID="8629ec87935b9c8163acca5e90c43ffc35598371cd514995496e1b481f1cd153" exitCode=255 Mar 13 10:54:29.903178 master-0 kubenswrapper[17876]: I0313 10:54:29.903039 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerDied","Data":"8629ec87935b9c8163acca5e90c43ffc35598371cd514995496e1b481f1cd153"} Mar 13 10:54:29.904241 master-0 kubenswrapper[17876]: I0313 10:54:29.904184 17876 scope.go:117] "RemoveContainer" containerID="8629ec87935b9c8163acca5e90c43ffc35598371cd514995496e1b481f1cd153" Mar 13 10:54:30.916844 master-0 kubenswrapper[17876]: I0313 10:54:30.916753 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-p7qlt_e4b55ebf-cab8-4985-95cc-b28bc5ae0578/cluster-autoscaler-operator/0.log" Mar 13 10:54:30.918248 master-0 kubenswrapper[17876]: I0313 10:54:30.918184 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-p7qlt" event={"ID":"e4b55ebf-cab8-4985-95cc-b28bc5ae0578","Type":"ContainerStarted","Data":"ce8b8549f5bd558244fc744659b7799fcbcc750a376d7213bb3003811914c055"} Mar 13 10:54:33.996936 master-0 kubenswrapper[17876]: I0313 10:54:33.996854 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 13 
10:54:33.998073 master-0 kubenswrapper[17876]: I0313 10:54:33.997831 17876 generic.go:334] "Generic (PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="e2c88e4c1fc855558d16a23967e91f39d888b2b5d567204372568c3c9fe0b418" exitCode=0 Mar 13 10:54:33.998073 master-0 kubenswrapper[17876]: I0313 10:54:33.997910 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"e2c88e4c1fc855558d16a23967e91f39d888b2b5d567204372568c3c9fe0b418"} Mar 13 10:54:33.999036 master-0 kubenswrapper[17876]: I0313 10:54:33.998986 17876 scope.go:117] "RemoveContainer" containerID="e2c88e4c1fc855558d16a23967e91f39d888b2b5d567204372568c3c9fe0b418" Mar 13 10:54:34.516230 master-0 kubenswrapper[17876]: I0313 10:54:34.515941 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:54:34.516230 master-0 kubenswrapper[17876]: I0313 10:54:34.516063 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:54:35.025336 master-0 kubenswrapper[17876]: I0313 10:54:35.025203 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1453f6461bf5d599ad65a4656343ee91/kube-scheduler/0.log" Mar 13 10:54:35.026430 master-0 kubenswrapper[17876]: 
I0313 10:54:35.026031 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"2113fec036a1f7b7b9b89d157be7afca90f31d0aee1808803dffb0f817e0a10d"} Mar 13 10:54:35.860856 master-0 kubenswrapper[17876]: E0313 10:54:35.860774 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:54:40.693715 master-0 kubenswrapper[17876]: E0313 10:54:40.693455 17876 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{metallb-operator-controller-manager-57755f98f6-7pnfb.189c6118685364a5 metallb-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:metallb-system,Name:metallb-operator-controller-manager-57755f98f6-7pnfb,UID:eb773726-9950-4399-871b-815d20abe38c,APIVersion:v1,ResourceVersion:17961,FieldPath:spec.containers{manager},},Reason:Started,Message:Started container manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-13 10:51:50.734537893 +0000 UTC m=+618.570344369,LastTimestamp:2026-03-13 10:51:50.734537893 +0000 UTC m=+618.570344369,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 13 10:54:41.346490 master-0 kubenswrapper[17876]: E0313 10:54:41.346358 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" interval="7s" Mar 13 10:54:44.515727 master-0 kubenswrapper[17876]: I0313 10:54:44.515573 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:54:44.515727 master-0 kubenswrapper[17876]: I0313 10:54:44.515666 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:54:44.515727 master-0 kubenswrapper[17876]: I0313 10:54:44.515731 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:44.651006 master-0 kubenswrapper[17876]: E0313 10:54:44.650917 17876 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 13 10:54:45.148242 master-0 kubenswrapper[17876]: I0313 10:54:45.147800 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"03199ce9469c2699b25c4c21ec3f21abf49fdd9b9655ef9a2eeedef923552c6b"} Mar 13 10:54:45.869482 master-0 kubenswrapper[17876]: E0313 10:54:45.869316 17876 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 13 10:54:46.165840 master-0 kubenswrapper[17876]: I0313 10:54:46.165725 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"12ef72b6f6d0bee6a4374d7a29a2237fb58bc5c09e0cf722a8c6df122f42ee00"} Mar 13 10:54:46.166174 master-0 kubenswrapper[17876]: I0313 10:54:46.165870 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"987624905af00c40f6927a98008a8a04f1ba3800775bf9d49d0fc65090e9f34c"} Mar 13 10:54:46.166174 master-0 kubenswrapper[17876]: I0313 10:54:46.165902 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"584c682792d45ea43957e249506ce0f53a65341df5a061b548ebc91dbfff3bb3"} Mar 13 10:54:46.166174 master-0 kubenswrapper[17876]: I0313 10:54:46.165932 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"531c5b37021de373e6d4540858fda613d71f49e4eb1f9e6ab5fd720ca80c30db"} Mar 13 10:54:46.166372 master-0 kubenswrapper[17876]: I0313 10:54:46.166261 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008" Mar 13 10:54:46.166372 master-0 kubenswrapper[17876]: I0313 10:54:46.166315 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008" Mar 13 10:54:47.179152 master-0 kubenswrapper[17876]: I0313 10:54:47.178965 17876 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/3.log" Mar 13 10:54:47.180740 master-0 kubenswrapper[17876]: I0313 10:54:47.180676 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/2.log" Mar 13 10:54:47.182659 master-0 kubenswrapper[17876]: I0313 10:54:47.182615 17876 generic.go:334] "Generic (PLEG): container finished" podID="d4c95608e26ddbbd2e5890fcd9f507b5" containerID="cbedeb377940e4c1f57020e4918ca1d5cca1e728e2596ddaaf6bd86b6efb5a59" exitCode=255 Mar 13 10:54:47.182848 master-0 kubenswrapper[17876]: I0313 10:54:47.182678 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerDied","Data":"cbedeb377940e4c1f57020e4918ca1d5cca1e728e2596ddaaf6bd86b6efb5a59"} Mar 13 10:54:47.183045 master-0 kubenswrapper[17876]: I0313 10:54:47.183021 17876 scope.go:117] "RemoveContainer" containerID="f5ead5a0b7301d0c828a45a6e8f562bb6ae051522693311fc222d83af77b779a" Mar 13 10:54:48.191998 master-0 kubenswrapper[17876]: I0313 10:54:48.191948 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/3.log" Mar 13 10:54:50.585195 master-0 kubenswrapper[17876]: I0313 10:54:50.585087 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 13 10:54:50.585195 master-0 kubenswrapper[17876]: I0313 10:54:50.585188 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 13 10:54:50.745168 master-0 kubenswrapper[17876]: E0313 10:54:50.745049 17876 mirror_client.go:138] "Failed 
deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:50.745675 master-0 kubenswrapper[17876]: I0313 10:54:50.745619 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"cbedeb377940e4c1f57020e4918ca1d5cca1e728e2596ddaaf6bd86b6efb5a59"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 10:54:50.745902 master-0 kubenswrapper[17876]: I0313 10:54:50.745849 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" containerID="cri-o://cbedeb377940e4c1f57020e4918ca1d5cca1e728e2596ddaaf6bd86b6efb5a59" gracePeriod=30 Mar 13 10:54:51.123654 master-0 kubenswrapper[17876]: I0313 10:54:51.123529 17876 status_manager.go:851] "Failed to get status for pod" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods csi-snapshot-controller-7577d6f48-kcw4k)" Mar 13 10:54:51.222492 master-0 kubenswrapper[17876]: I0313 10:54:51.222332 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/3.log" Mar 13 10:54:51.223415 master-0 kubenswrapper[17876]: I0313 10:54:51.223365 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e"} Mar 13 10:54:51.223732 master-0 kubenswrapper[17876]: I0313 10:54:51.223701 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:54:51.223732 master-0 kubenswrapper[17876]: I0313 10:54:51.223726 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="9f5ef835-aaae-4010-8d21-ae14dbdeff57" Mar 13 10:54:51.515389 master-0 kubenswrapper[17876]: I0313 10:54:51.515204 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:51.515597 master-0 kubenswrapper[17876]: I0313 10:54:51.515413 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:54.516195 master-0 kubenswrapper[17876]: I0313 10:54:54.516135 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:54:54.516810 master-0 kubenswrapper[17876]: I0313 10:54:54.516219 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 
10:54:55.265849 master-0 kubenswrapper[17876]: I0313 10:54:55.265797 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/3.log" Mar 13 10:54:55.266386 master-0 kubenswrapper[17876]: I0313 10:54:55.266356 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/2.log" Mar 13 10:54:55.266434 master-0 kubenswrapper[17876]: I0313 10:54:55.266410 17876 generic.go:334] "Generic (PLEG): container finished" podID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" containerID="4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7" exitCode=1 Mar 13 10:54:55.266480 master-0 kubenswrapper[17876]: I0313 10:54:55.266454 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerDied","Data":"4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7"} Mar 13 10:54:55.266535 master-0 kubenswrapper[17876]: I0313 10:54:55.266507 17876 scope.go:117] "RemoveContainer" containerID="b75fbf8a5624be5a03d5c44ed09ccde072e0ccf0feb8c353317c2e50de8d72cb" Mar 13 10:54:55.267148 master-0 kubenswrapper[17876]: I0313 10:54:55.267089 17876 scope.go:117] "RemoveContainer" containerID="4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7" Mar 13 10:54:55.267490 master-0 kubenswrapper[17876]: E0313 10:54:55.267458 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" Mar 13 10:54:56.085114 master-0 kubenswrapper[17876]: I0313 10:54:56.085015 17876 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:54:56.088707 master-0 kubenswrapper[17876]: I0313 10:54:56.088666 17876 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 13 10:54:56.141910 master-0 kubenswrapper[17876]: I0313 10:54:56.141826 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:54:56.151320 master-0 kubenswrapper[17876]: I0313 10:54:56.151245 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:54:56.175128 master-0 kubenswrapper[17876]: I0313 10:54:56.174180 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:54:56.191411 master-0 kubenswrapper[17876]: I0313 10:54:56.183378 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:54:56.216350 master-0 kubenswrapper[17876]: I0313 10:54:56.216244 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-57755f98f6-7pnfb" podStartSLOduration=187.533905985 podStartE2EDuration="3m13.216205575s" podCreationTimestamp="2026-03-13 10:51:43 +0000 UTC" firstStartedPulling="2026-03-13 10:51:44.802768147 +0000 UTC m=+612.638574613" lastFinishedPulling="2026-03-13 10:51:50.485067727 +0000 UTC m=+618.320874203" observedRunningTime="2026-03-13 10:54:56.21463472 +0000 UTC m=+804.050441206" watchObservedRunningTime="2026-03-13 10:54:56.216205575 +0000 UTC m=+804.052012071" Mar 13 10:54:56.277453 master-0 kubenswrapper[17876]: I0313 
10:54:56.277380 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/3.log" Mar 13 10:54:58.355522 master-0 kubenswrapper[17876]: E0313 10:54:58.355446 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="the server was unable to return a response in the time allotted, but may still be processing the request (get leases.coordination.k8s.io master-0)" interval="7s" Mar 13 10:54:58.943928 master-0 kubenswrapper[17876]: I0313 10:54:58.943868 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 13 10:54:58.950284 master-0 kubenswrapper[17876]: I0313 10:54:58.950223 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 13 10:54:59.311068 master-0 kubenswrapper[17876]: I0313 10:54:59.310928 17876 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008" Mar 13 10:54:59.311068 master-0 kubenswrapper[17876]: I0313 10:54:59.310961 17876 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="a92428d0-0dd8-4dc0-9ad1-98650c200008" Mar 13 10:55:00.612040 master-0 kubenswrapper[17876]: I0313 10:55:00.611951 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 13 10:55:00.655088 master-0 kubenswrapper[17876]: I0313 10:55:00.652822 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=2.6528044509999997 podStartE2EDuration="2.652804451s" podCreationTimestamp="2026-03-13 10:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:55:00.648020206 +0000 UTC m=+808.483826712" 
watchObservedRunningTime="2026-03-13 10:55:00.652804451 +0000 UTC m=+808.488610937" Mar 13 10:55:00.683888 master-0 kubenswrapper[17876]: I0313 10:55:00.683794 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.6837735289999998 podStartE2EDuration="2.683773529s" podCreationTimestamp="2026-03-13 10:54:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:55:00.679994002 +0000 UTC m=+808.515800498" watchObservedRunningTime="2026-03-13 10:55:00.683773529 +0000 UTC m=+808.519580015" Mar 13 10:55:01.341252 master-0 kubenswrapper[17876]: I0313 10:55:01.341124 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0" Mar 13 10:55:04.516767 master-0 kubenswrapper[17876]: I0313 10:55:04.516657 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:55:04.516767 master-0 kubenswrapper[17876]: I0313 10:55:04.516740 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:55:06.513926 master-0 kubenswrapper[17876]: I0313 10:55:06.513860 17876 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: 
Readiness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:55:06.513926 master-0 kubenswrapper[17876]: I0313 10:55:06.513933 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:55:06.513926 master-0 kubenswrapper[17876]: I0313 10:55:06.513947 17876 patch_prober.go:28] interesting pod/openshift-kube-scheduler-master-0 container/kube-scheduler namespace/openshift-kube-scheduler: Liveness probe status=failure output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:55:06.514774 master-0 kubenswrapper[17876]: I0313 10:55:06.514038 17876 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1453f6461bf5d599ad65a4656343ee91" containerName="kube-scheduler" probeResult="failure" output="Get \"https://192.168.32.10:10259/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:55:08.494589 master-0 kubenswrapper[17876]: I0313 10:55:08.494519 17876 scope.go:117] "RemoveContainer" containerID="4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7" Mar 13 10:55:08.495439 master-0 kubenswrapper[17876]: E0313 10:55:08.494789 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed 
container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" Mar 13 10:55:14.516292 master-0 kubenswrapper[17876]: I0313 10:55:14.515927 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 13 10:55:14.516292 master-0 kubenswrapper[17876]: I0313 10:55:14.516039 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 13 10:55:14.516292 master-0 kubenswrapper[17876]: I0313 10:55:14.516157 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:55:14.517747 master-0 kubenswrapper[17876]: I0313 10:55:14.517504 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 13 10:55:14.517747 master-0 kubenswrapper[17876]: I0313 10:55:14.517690 17876 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" containerID="cri-o://5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e" gracePeriod=30 Mar 13 10:55:14.644446 master-0 kubenswrapper[17876]: E0313 10:55:14.644372 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(d4c95608e26ddbbd2e5890fcd9f507b5)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:55:15.356896 master-0 kubenswrapper[17876]: E0313 10:55:15.356765 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 10:55:15.496857 master-0 kubenswrapper[17876]: I0313 10:55:15.496783 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/4.log" Mar 13 10:55:15.497485 master-0 kubenswrapper[17876]: I0313 10:55:15.497435 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/3.log" Mar 13 10:55:15.499039 master-0 kubenswrapper[17876]: I0313 10:55:15.498808 17876 generic.go:334] "Generic (PLEG): container finished" podID="d4c95608e26ddbbd2e5890fcd9f507b5" containerID="5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e" exitCode=255 Mar 13 10:55:15.499179 
master-0 kubenswrapper[17876]: I0313 10:55:15.499076 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerDied","Data":"5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e"} Mar 13 10:55:15.499294 master-0 kubenswrapper[17876]: I0313 10:55:15.499259 17876 scope.go:117] "RemoveContainer" containerID="cbedeb377940e4c1f57020e4918ca1d5cca1e728e2596ddaaf6bd86b6efb5a59" Mar 13 10:55:15.500164 master-0 kubenswrapper[17876]: I0313 10:55:15.500087 17876 scope.go:117] "RemoveContainer" containerID="5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e" Mar 13 10:55:15.500636 master-0 kubenswrapper[17876]: E0313 10:55:15.500590 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(d4c95608e26ddbbd2e5890fcd9f507b5)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:55:15.519456 master-0 kubenswrapper[17876]: I0313 10:55:15.519402 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 13 10:55:16.509844 master-0 kubenswrapper[17876]: I0313 10:55:16.509782 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/4.log" Mar 13 10:55:21.515356 master-0 kubenswrapper[17876]: I0313 10:55:21.515253 17876 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:55:21.519749 master-0 kubenswrapper[17876]: 
I0313 10:55:21.519693 17876 scope.go:117] "RemoveContainer" containerID="5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e" Mar 13 10:55:21.520127 master-0 kubenswrapper[17876]: E0313 10:55:21.520064 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(d4c95608e26ddbbd2e5890fcd9f507b5)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:55:22.504980 master-0 kubenswrapper[17876]: I0313 10:55:22.504907 17876 scope.go:117] "RemoveContainer" containerID="4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7" Mar 13 10:55:22.619175 master-0 kubenswrapper[17876]: I0313 10:55:22.618676 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/2.log" Mar 13 10:55:22.621308 master-0 kubenswrapper[17876]: I0313 10:55:22.621269 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/1.log" Mar 13 10:55:22.621726 master-0 kubenswrapper[17876]: I0313 10:55:22.621688 17876 generic.go:334] "Generic (PLEG): container finished" podID="0881de70-2db3-4fc2-b976-b55c11dc239d" containerID="a2a3a6dbc09acd3b5c004ca8d99a0995e19ad749f409ecb1761d51cda96db999" exitCode=1 Mar 13 10:55:22.621892 master-0 kubenswrapper[17876]: I0313 10:55:22.621766 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" 
event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerDied","Data":"a2a3a6dbc09acd3b5c004ca8d99a0995e19ad749f409ecb1761d51cda96db999"} Mar 13 10:55:22.622068 master-0 kubenswrapper[17876]: I0313 10:55:22.622039 17876 scope.go:117] "RemoveContainer" containerID="df8dbee9c77b0ca318382f012c8a23d7d342a4f43e0448369274b4a7e9be8d82" Mar 13 10:55:22.622830 master-0 kubenswrapper[17876]: I0313 10:55:22.622726 17876 scope.go:117] "RemoveContainer" containerID="a2a3a6dbc09acd3b5c004ca8d99a0995e19ad749f409ecb1761d51cda96db999" Mar 13 10:55:22.623021 master-0 kubenswrapper[17876]: E0313 10:55:22.622993 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-2c4sl_openshift-machine-api(0881de70-2db3-4fc2-b976-b55c11dc239d)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" podUID="0881de70-2db3-4fc2-b976-b55c11dc239d" Mar 13 10:55:23.634236 master-0 kubenswrapper[17876]: I0313 10:55:23.634158 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/3.log" Mar 13 10:55:23.634888 master-0 kubenswrapper[17876]: I0313 10:55:23.634298 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748"} Mar 13 10:55:23.637219 master-0 kubenswrapper[17876]: I0313 10:55:23.636976 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/2.log" Mar 13 
10:55:32.358920 master-0 kubenswrapper[17876]: E0313 10:55:32.358815 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 10:55:35.494483 master-0 kubenswrapper[17876]: I0313 10:55:35.494415 17876 scope.go:117] "RemoveContainer" containerID="a2a3a6dbc09acd3b5c004ca8d99a0995e19ad749f409ecb1761d51cda96db999" Mar 13 10:55:35.495354 master-0 kubenswrapper[17876]: I0313 10:55:35.495301 17876 scope.go:117] "RemoveContainer" containerID="5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e" Mar 13 10:55:35.495740 master-0 kubenswrapper[17876]: E0313 10:55:35.495703 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(d4c95608e26ddbbd2e5890fcd9f507b5)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:55:35.745244 master-0 kubenswrapper[17876]: I0313 10:55:35.745083 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/2.log" Mar 13 10:55:35.745547 master-0 kubenswrapper[17876]: I0313 10:55:35.745481 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"98ef60368c7b2c21e223da3e64f5b0b3b589d9ad858fd40c4b2a65df1c36e8cf"} Mar 13 10:55:49.360665 master-0 kubenswrapper[17876]: E0313 10:55:49.360456 17876 
controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 13 10:55:50.494550 master-0 kubenswrapper[17876]: I0313 10:55:50.494464 17876 scope.go:117] "RemoveContainer" containerID="5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e" Mar 13 10:55:50.495203 master-0 kubenswrapper[17876]: E0313 10:55:50.494738 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(d4c95608e26ddbbd2e5890fcd9f507b5)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" Mar 13 10:55:52.948714 master-0 kubenswrapper[17876]: I0313 10:55:52.948639 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/4.log" Mar 13 10:55:52.949564 master-0 kubenswrapper[17876]: I0313 10:55:52.949492 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/3.log" Mar 13 10:55:52.949657 master-0 kubenswrapper[17876]: I0313 10:55:52.949628 17876 generic.go:334] "Generic (PLEG): container finished" podID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" containerID="c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748" exitCode=1 Mar 13 10:55:52.949732 master-0 kubenswrapper[17876]: I0313 10:55:52.949674 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerDied","Data":"c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748"}
Mar 13 10:55:52.949732 master-0 kubenswrapper[17876]: I0313 10:55:52.949720 17876 scope.go:117] "RemoveContainer" containerID="4ff52f9e6736f5cab4ce6578d4344e93b6e1f872b5fbfd311914f862e31091c7"
Mar 13 10:55:52.950259 master-0 kubenswrapper[17876]: I0313 10:55:52.950225 17876 scope.go:117] "RemoveContainer" containerID="c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748"
Mar 13 10:55:52.950638 master-0 kubenswrapper[17876]: E0313 10:55:52.950522 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee"
Mar 13 10:55:53.958951 master-0 kubenswrapper[17876]: I0313 10:55:53.958887 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/4.log"
Mar 13 10:56:02.500834 master-0 kubenswrapper[17876]: I0313 10:56:02.500743 17876 scope.go:117] "RemoveContainer" containerID="5200fad00dafca12535688f0851a3f05c06380d9761a3d72d71eb1819fa23e3e"
Mar 13 10:56:03.045798 master-0 kubenswrapper[17876]: I0313 10:56:03.045601 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_d4c95608e26ddbbd2e5890fcd9f507b5/cluster-policy-controller/4.log"
Mar 13 10:56:03.046832 master-0 kubenswrapper[17876]: I0313 10:56:03.046755 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"d4c95608e26ddbbd2e5890fcd9f507b5","Type":"ContainerStarted","Data":"f576aeac51cfdcf837566b232c52641dad5789e2893b0e009409e3ef36ff7387"}
Mar 13 10:56:04.496666 master-0 kubenswrapper[17876]: I0313 10:56:04.496614 17876 scope.go:117] "RemoveContainer" containerID="c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748"
Mar 13 10:56:04.498270 master-0 kubenswrapper[17876]: E0313 10:56:04.498240 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee"
Mar 13 10:56:06.362565 master-0 kubenswrapper[17876]: E0313 10:56:06.362439 17876 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 13 10:56:11.516247 master-0 kubenswrapper[17876]: I0313 10:56:11.515633 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:56:11.516247 master-0 kubenswrapper[17876]: I0313 10:56:11.515729 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 13 10:56:12.405268 master-0 kubenswrapper[17876]: I0313 10:56:12.405127 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": dial tcp 10.128.0.15:8443: connect: connection refused" start-of-body=
Mar 13 10:56:12.405268 master-0 kubenswrapper[17876]: I0313 10:56:12.405197 17876 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": dial tcp 10.128.0.15:8443: connect: connection refused"
Mar 13 10:56:12.407771 master-0 kubenswrapper[17876]: I0313 10:56:12.407704 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": dial tcp 10.128.0.15:8443: connect: connection refused" start-of-body=
Mar 13 10:56:12.407771 master-0 kubenswrapper[17876]: I0313 10:56:12.407752 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": dial tcp 10.128.0.15:8443: connect: connection refused"
Mar 13 10:56:13.128619 master-0 kubenswrapper[17876]: I0313 10:56:13.128556 17876 generic.go:334] "Generic (PLEG): container finished" podID="ba3e43ba-2840-4612-a370-87ad3c5a382a" containerID="d028fc794a246b2460076d0dced5db6f65d2c7474177aae275ffc67970fe251d" exitCode=0
Mar 13 10:56:13.129629 master-0 kubenswrapper[17876]: I0313 10:56:13.128657 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" event={"ID":"ba3e43ba-2840-4612-a370-87ad3c5a382a","Type":"ContainerDied","Data":"d028fc794a246b2460076d0dced5db6f65d2c7474177aae275ffc67970fe251d"}
Mar 13 10:56:13.129629 master-0 kubenswrapper[17876]: I0313 10:56:13.129397 17876 scope.go:117] "RemoveContainer" containerID="d028fc794a246b2460076d0dced5db6f65d2c7474177aae275ffc67970fe251d"
Mar 13 10:56:13.130807 master-0 kubenswrapper[17876]: I0313 10:56:13.130764 17876 generic.go:334] "Generic (PLEG): container finished" podID="e87ca16c-25de-4fea-b900-2960f4a5f95e" containerID="02539d7838ebb483ffcca293d983b439f593e30b5eaf03def36de01bbe1607e5" exitCode=0
Mar 13 10:56:13.130888 master-0 kubenswrapper[17876]: I0313 10:56:13.130841 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" event={"ID":"e87ca16c-25de-4fea-b900-2960f4a5f95e","Type":"ContainerDied","Data":"02539d7838ebb483ffcca293d983b439f593e30b5eaf03def36de01bbe1607e5"}
Mar 13 10:56:13.131433 master-0 kubenswrapper[17876]: I0313 10:56:13.131191 17876 scope.go:117] "RemoveContainer" containerID="02539d7838ebb483ffcca293d983b439f593e30b5eaf03def36de01bbe1607e5"
Mar 13 10:56:13.133552 master-0 kubenswrapper[17876]: I0313 10:56:13.133413 17876 generic.go:334] "Generic (PLEG): container finished" podID="1f358d81-87c6-40bf-89e8-5681429285f8" containerID="b5048988f4d14da58f4ecce60f1b0f53c921c94b9f30bb0d6da211a5c6a3196b" exitCode=0
Mar 13 10:56:13.133552 master-0 kubenswrapper[17876]: I0313 10:56:13.133480 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" event={"ID":"1f358d81-87c6-40bf-89e8-5681429285f8","Type":"ContainerDied","Data":"b5048988f4d14da58f4ecce60f1b0f53c921c94b9f30bb0d6da211a5c6a3196b"}
Mar 13 10:56:13.134793 master-0 kubenswrapper[17876]: I0313 10:56:13.133794 17876 scope.go:117] "RemoveContainer" containerID="b5048988f4d14da58f4ecce60f1b0f53c921c94b9f30bb0d6da211a5c6a3196b"
Mar 13 10:56:13.141847 master-0 kubenswrapper[17876]: I0313 10:56:13.136966 17876 generic.go:334] "Generic (PLEG): container finished" podID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerID="80e219d86f62937cb95412f8c97959374104036b5299a214ae589c72f2965a63" exitCode=0
Mar 13 10:56:13.141847 master-0 kubenswrapper[17876]: I0313 10:56:13.137015 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerDied","Data":"80e219d86f62937cb95412f8c97959374104036b5299a214ae589c72f2965a63"}
Mar 13 10:56:13.141847 master-0 kubenswrapper[17876]: I0313 10:56:13.137085 17876 scope.go:117] "RemoveContainer" containerID="5453f8e7d2354fdecf5aaa7a8c779183aeebd89bd33b88fc63b38c312ff6ebc3"
Mar 13 10:56:13.141847 master-0 kubenswrapper[17876]: I0313 10:56:13.137552 17876 scope.go:117] "RemoveContainer" containerID="80e219d86f62937cb95412f8c97959374104036b5299a214ae589c72f2965a63"
Mar 13 10:56:13.141847 master-0 kubenswrapper[17876]: I0313 10:56:13.141674 17876 generic.go:334] "Generic (PLEG): container finished" podID="36d4251d3504cdc0ec85144c1379056c" containerID="7d795b2a4120951cab58f7dc86deb216ded65952d82db5b03d506bcb6832ee11" exitCode=0
Mar 13 10:56:13.141847 master-0 kubenswrapper[17876]: I0313 10:56:13.141804 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerDied","Data":"7d795b2a4120951cab58f7dc86deb216ded65952d82db5b03d506bcb6832ee11"}
Mar 13 10:56:13.143240 master-0 kubenswrapper[17876]: I0313 10:56:13.142512 17876 scope.go:117] "RemoveContainer" containerID="7d795b2a4120951cab58f7dc86deb216ded65952d82db5b03d506bcb6832ee11"
Mar 13 10:56:13.146191 master-0 kubenswrapper[17876]: I0313 10:56:13.145608 17876 generic.go:334] "Generic (PLEG): container finished" podID="1109b282-3ee4-4c4e-a64a-e6a22adeb6c9" containerID="789afc8d8a6e306039624650966ee23018a7ba7ea1dbcde6122e9d4057d3711b" exitCode=0
Mar 13 10:56:13.146191 master-0 kubenswrapper[17876]: I0313 10:56:13.145764 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" event={"ID":"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9","Type":"ContainerDied","Data":"789afc8d8a6e306039624650966ee23018a7ba7ea1dbcde6122e9d4057d3711b"}
Mar 13 10:56:13.147380 master-0 kubenswrapper[17876]: I0313 10:56:13.147301 17876 scope.go:117] "RemoveContainer" containerID="789afc8d8a6e306039624650966ee23018a7ba7ea1dbcde6122e9d4057d3711b"
Mar 13 10:56:13.154380 master-0 kubenswrapper[17876]: I0313 10:56:13.153902 17876 generic.go:334] "Generic (PLEG): container finished" podID="86a4a18e-2256-4c27-9953-1a9dca3926d6" containerID="29b7f8122b28d14392b2329bcad49a6fb43b2112d07f9c58515875136d5160c6" exitCode=0
Mar 13 10:56:13.154380 master-0 kubenswrapper[17876]: I0313 10:56:13.153994 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" event={"ID":"86a4a18e-2256-4c27-9953-1a9dca3926d6","Type":"ContainerDied","Data":"29b7f8122b28d14392b2329bcad49a6fb43b2112d07f9c58515875136d5160c6"}
Mar 13 10:56:13.154573 master-0 kubenswrapper[17876]: I0313 10:56:13.154533 17876 scope.go:117] "RemoveContainer" containerID="29b7f8122b28d14392b2329bcad49a6fb43b2112d07f9c58515875136d5160c6"
Mar 13 10:56:13.157823 master-0 kubenswrapper[17876]: I0313 10:56:13.157703 17876 generic.go:334] "Generic (PLEG): container finished" podID="893dac15-d6d4-4a1f-988c-59aaf9e63334" containerID="e541c073a97e968aa996efa485f9023f303d33477bd12a38bf45fb29e057d0dc" exitCode=0
Mar 13 10:56:13.158245 master-0 kubenswrapper[17876]: I0313 10:56:13.158192 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerDied","Data":"e541c073a97e968aa996efa485f9023f303d33477bd12a38bf45fb29e057d0dc"}
Mar 13 10:56:13.159259 master-0 kubenswrapper[17876]: I0313 10:56:13.159223 17876 scope.go:117] "RemoveContainer" containerID="e541c073a97e968aa996efa485f9023f303d33477bd12a38bf45fb29e057d0dc"
Mar 13 10:56:13.161618 master-0 kubenswrapper[17876]: I0313 10:56:13.161132 17876 generic.go:334] "Generic (PLEG): container finished" podID="ecb5bdcc-647d-4292-a33d-dc3df331c206" containerID="cc39dd97fa33d7186bc0c795b8d5e196c978cac3bdc2c8d9dbf7380009448266" exitCode=0
Mar 13 10:56:13.161618 master-0 kubenswrapper[17876]: I0313 10:56:13.161207 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" event={"ID":"ecb5bdcc-647d-4292-a33d-dc3df331c206","Type":"ContainerDied","Data":"cc39dd97fa33d7186bc0c795b8d5e196c978cac3bdc2c8d9dbf7380009448266"}
Mar 13 10:56:13.162073 master-0 kubenswrapper[17876]: I0313 10:56:13.161938 17876 scope.go:117] "RemoveContainer" containerID="cc39dd97fa33d7186bc0c795b8d5e196c978cac3bdc2c8d9dbf7380009448266"
Mar 13 10:56:13.177008 master-0 kubenswrapper[17876]: I0313 10:56:13.176966 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-z9wrg_8d2fdba3-9478-4165-9207-d01483625607/network-operator/1.log"
Mar 13 10:56:13.177168 master-0 kubenswrapper[17876]: I0313 10:56:13.177026 17876 generic.go:334] "Generic (PLEG): container finished" podID="8d2fdba3-9478-4165-9207-d01483625607" containerID="f92b7dcf30e2a83f947525493e88745aa9417da1536fbf60b66ed4a133a0e4a5" exitCode=0
Mar 13 10:56:13.177168 master-0 kubenswrapper[17876]: I0313 10:56:13.177114 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerDied","Data":"f92b7dcf30e2a83f947525493e88745aa9417da1536fbf60b66ed4a133a0e4a5"}
Mar 13 10:56:13.177512 master-0 kubenswrapper[17876]: I0313 10:56:13.177482 17876 scope.go:117] "RemoveContainer" containerID="f92b7dcf30e2a83f947525493e88745aa9417da1536fbf60b66ed4a133a0e4a5"
Mar 13 10:56:13.207618 master-0 kubenswrapper[17876]: I0313 10:56:13.207578 17876 scope.go:117] "RemoveContainer" containerID="32f554dfe2b5d2edb99552cb7272b4f7f637a178e9e2dbe6b124630a524d92b0"
Mar 13 10:56:13.208538 master-0 kubenswrapper[17876]: I0313 10:56:13.208371 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_cluster-storage-operator-6fbfc8dc8f-wz9t2_b57f1c19-f44a-4405-8135-79aef1d1ce07/cluster-storage-operator/0.log"
Mar 13 10:56:13.208538 master-0 kubenswrapper[17876]: I0313 10:56:13.208414 17876 generic.go:334] "Generic (PLEG): container finished" podID="b57f1c19-f44a-4405-8135-79aef1d1ce07" containerID="8d2502ddf45dc60246cfc038c25340d355c40feb7ef15264d33e1c93664efbd3" exitCode=0
Mar 13 10:56:13.208538 master-0 kubenswrapper[17876]: I0313 10:56:13.208486 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerDied","Data":"8d2502ddf45dc60246cfc038c25340d355c40feb7ef15264d33e1c93664efbd3"}
Mar 13 10:56:13.208930 master-0 kubenswrapper[17876]: I0313 10:56:13.208879 17876 scope.go:117] "RemoveContainer" containerID="8d2502ddf45dc60246cfc038c25340d355c40feb7ef15264d33e1c93664efbd3"
Mar 13 10:56:13.223683 master-0 kubenswrapper[17876]: I0313 10:56:13.223638 17876 generic.go:334] "Generic (PLEG): container finished" podID="7cf7b1dc-96ab-41ef-871c-9ed5ce2db584" containerID="a37231e5cc55c1e76147d48ef0838775a990a3f28298bc163b9c8540136b0b87" exitCode=0
Mar 13 10:56:13.223925 master-0 kubenswrapper[17876]: I0313 10:56:13.223871 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerDied","Data":"a37231e5cc55c1e76147d48ef0838775a990a3f28298bc163b9c8540136b0b87"}
Mar 13 10:56:13.226742 master-0 kubenswrapper[17876]: I0313 10:56:13.224482 17876 scope.go:117] "RemoveContainer" containerID="a37231e5cc55c1e76147d48ef0838775a990a3f28298bc163b9c8540136b0b87"
Mar 13 10:56:13.229440 master-0 kubenswrapper[17876]: I0313 10:56:13.229399 17876 generic.go:334] "Generic (PLEG): container finished" podID="61427254-6722-4d1a-a96a-dadd24abbe94" containerID="d11003d934637dd1f9b6e8d23feaca1fc18325edb8c1c59e1375d0720a4469cd" exitCode=0
Mar 13 10:56:13.229502 master-0 kubenswrapper[17876]: I0313 10:56:13.229456 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerDied","Data":"d11003d934637dd1f9b6e8d23feaca1fc18325edb8c1c59e1375d0720a4469cd"}
Mar 13 10:56:13.230030 master-0 kubenswrapper[17876]: I0313 10:56:13.229931 17876 scope.go:117] "RemoveContainer" containerID="d11003d934637dd1f9b6e8d23feaca1fc18325edb8c1c59e1375d0720a4469cd"
Mar 13 10:56:13.235221 master-0 kubenswrapper[17876]: I0313 10:56:13.235187 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-olm-operator_cluster-olm-operator-77899cf6d-xvxcr_f8c7f667-d30e-41f4-8c0e-f3f138bffab4/cluster-olm-operator/0.log"
Mar 13 10:56:13.238541 master-0 kubenswrapper[17876]: I0313 10:56:13.238474 17876 generic.go:334] "Generic (PLEG): container finished" podID="f8c7f667-d30e-41f4-8c0e-f3f138bffab4" containerID="2011f2a930c1149a0110b2744b7cf0ecd80491982b05c3fd36024d0672252582" exitCode=0
Mar 13 10:56:13.238633 master-0 kubenswrapper[17876]: I0313 10:56:13.238541 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerDied","Data":"2011f2a930c1149a0110b2744b7cf0ecd80491982b05c3fd36024d0672252582"}
Mar 13 10:56:13.240063 master-0 kubenswrapper[17876]: I0313 10:56:13.240024 17876 scope.go:117] "RemoveContainer" containerID="2011f2a930c1149a0110b2744b7cf0ecd80491982b05c3fd36024d0672252582"
Mar 13 10:56:13.289938 master-0 kubenswrapper[17876]: I0313 10:56:13.289903 17876 scope.go:117] "RemoveContainer" containerID="1fce45be6e6d39715a2674d4a14ecd62cb939d40d2e0a1372b2890dfa0404258"
Mar 13 10:56:13.445167 master-0 kubenswrapper[17876]: I0313 10:56:13.445130 17876 scope.go:117] "RemoveContainer" containerID="6b5d5cf72dc30cb2bb4b67993673d5f4c06ff28bce7b145ba5ca0708943e3dea"
Mar 13 10:56:13.575365 master-0 kubenswrapper[17876]: I0313 10:56:13.575208 17876 scope.go:117] "RemoveContainer" containerID="2b45cf18a0a7d8f1398d541364781f61869bca76d228c2c379591ee1130b97ba"
Mar 13 10:56:13.655664 master-0 kubenswrapper[17876]: I0313 10:56:13.655625 17876 scope.go:117] "RemoveContainer" containerID="c2a4f6dd59a861840771a43677396a253f52076df338f662965d0691159c9660"
Mar 13 10:56:14.247713 master-0 kubenswrapper[17876]: I0313 10:56:14.247658 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-pn89z" event={"ID":"e87ca16c-25de-4fea-b900-2960f4a5f95e","Type":"ContainerStarted","Data":"4ab6dfb9d4063231002bce5378d18a69b231b14c61676f3d2823be2e888a6c29"}
Mar 13 10:56:14.249509 master-0 kubenswrapper[17876]: I0313 10:56:14.249478 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-z9wrg" event={"ID":"8d2fdba3-9478-4165-9207-d01483625607","Type":"ContainerStarted","Data":"100c4a561fd9f9372af196d61a62f2870450eebc367a329b00307ffcf7a43e52"}
Mar 13 10:56:14.251339 master-0 kubenswrapper[17876]: I0313 10:56:14.251119 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-wz9t2" event={"ID":"b57f1c19-f44a-4405-8135-79aef1d1ce07","Type":"ContainerStarted","Data":"b222ebc1c2a6fcfd96a039165504510a8593eee5ee92868c6a077af6ccf0df60"}
Mar 13 10:56:14.253328 master-0 kubenswrapper[17876]: I0313 10:56:14.252783 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-cvqxk" event={"ID":"7cf7b1dc-96ab-41ef-871c-9ed5ce2db584","Type":"ContainerStarted","Data":"836ed1f0b2726465b2dad9e29828e9aa1fb58b0e295e88de8488834526187ab6"}
Mar 13 10:56:14.255066 master-0 kubenswrapper[17876]: I0313 10:56:14.255047 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-xvxcr" event={"ID":"f8c7f667-d30e-41f4-8c0e-f3f138bffab4","Type":"ContainerStarted","Data":"65f048354ca27793dc283180de76b32433aa39eadf1d95684a3248de7e4f7a1b"}
Mar 13 10:56:14.258184 master-0 kubenswrapper[17876]: I0313 10:56:14.258133 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"36d4251d3504cdc0ec85144c1379056c","Type":"ContainerStarted","Data":"6b15111f5cc70e7a943d812e5431a2376655f27f0de42534497c9e3e68c21a05"}
Mar 13 10:56:14.282210 master-0 kubenswrapper[17876]: I0313 10:56:14.277981 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-4kpg8" event={"ID":"1f358d81-87c6-40bf-89e8-5681429285f8","Type":"ContainerStarted","Data":"deb9070a5e76d300e074ff48cdaf3c3c1996e0e03a0574689a80ac5c11daf63c"}
Mar 13 10:56:14.285005 master-0 kubenswrapper[17876]: I0313 10:56:14.284967 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-kxmt9" event={"ID":"ba3e43ba-2840-4612-a370-87ad3c5a382a","Type":"ContainerStarted","Data":"477f622c197e656d5cd9c93120d95f38bd898070b296cd40eb351955c7ce8e95"}
Mar 13 10:56:14.292047 master-0 kubenswrapper[17876]: I0313 10:56:14.291995 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-8kd6c" event={"ID":"ecb5bdcc-647d-4292-a33d-dc3df331c206","Type":"ContainerStarted","Data":"be4c35eb9eefdb781c4b993c38fb5ee1d21a6bc73b841567c3c41780661399c6"}
Mar 13 10:56:14.301728 master-0 kubenswrapper[17876]: I0313 10:56:14.301680 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-c65k4" event={"ID":"61427254-6722-4d1a-a96a-dadd24abbe94","Type":"ContainerStarted","Data":"77f7f7fa6fc13342ba0ba4e58bd80b5caae48c201d937c920fb4a1155e830dc1"}
Mar 13 10:56:14.305958 master-0 kubenswrapper[17876]: I0313 10:56:14.305917 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" event={"ID":"1109b282-3ee4-4c4e-a64a-e6a22adeb6c9","Type":"ContainerStarted","Data":"da6fc191fa346c20ad1e25f3737a194b9e52aeb7fc6fd879f18a269009cf1aa8"}
Mar 13 10:56:14.306703 master-0 kubenswrapper[17876]: I0313 10:56:14.306403 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv"
Mar 13 10:56:14.308591 master-0 kubenswrapper[17876]: I0313 10:56:14.308549 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" event={"ID":"86a4a18e-2256-4c27-9953-1a9dca3926d6","Type":"ContainerStarted","Data":"7bda8ac5496cb6a84d385581612f4667b2767a4a4295d770d155076dca96c834"}
Mar 13 10:56:14.309502 master-0 kubenswrapper[17876]: I0313 10:56:14.309479 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv"
Mar 13 10:56:14.312134 master-0 kubenswrapper[17876]: I0313 10:56:14.312078 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-ml9xh" event={"ID":"893dac15-d6d4-4a1f-988c-59aaf9e63334","Type":"ContainerStarted","Data":"f5b5c7699174e58c3c5a48feb46f69bb37e0358b5b04653e59044cef8b124c67"}
Mar 13 10:56:14.315509 master-0 kubenswrapper[17876]: I0313 10:56:14.315149 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"8b8db2cc532dce87e8f0525e63017202c20cb334885568fe9a9f6b0def42b2b3"}
Mar 13 10:56:14.315910 master-0 kubenswrapper[17876]: I0313 10:56:14.315873 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:56:14.516236 master-0 kubenswrapper[17876]: I0313 10:56:14.516062 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:14.516236 master-0 kubenswrapper[17876]: I0313 10:56:14.516185 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:15.306941 master-0 kubenswrapper[17876]: I0313 10:56:15.306833 17876 patch_prober.go:28] interesting pod/console-operator-6c7fb6b958-rb7nv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.128.0.80:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:15.307833 master-0 kubenswrapper[17876]: I0313 10:56:15.306963 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" podUID="1109b282-3ee4-4c4e-a64a-e6a22adeb6c9" containerName="console-operator" probeResult="failure" output="Get \"https://10.128.0.80:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:15.310297 master-0 kubenswrapper[17876]: I0313 10:56:15.310219 17876 patch_prober.go:28] interesting pod/route-controller-manager-69c7cffc4c-7h7mv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.89:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:15.310477 master-0 kubenswrapper[17876]: I0313 10:56:15.310308 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" podUID="86a4a18e-2256-4c27-9953-1a9dca3926d6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.89:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:16.330454 master-0 kubenswrapper[17876]: I0313 10:56:16.330353 17876 patch_prober.go:28] interesting pod/route-controller-manager-69c7cffc4c-7h7mv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.89:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:16.331136 master-0 kubenswrapper[17876]: I0313 10:56:16.330459 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" podUID="86a4a18e-2256-4c27-9953-1a9dca3926d6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.89:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:16.333550 master-0 kubenswrapper[17876]: I0313 10:56:16.333481 17876 patch_prober.go:28] interesting pod/console-operator-6c7fb6b958-rb7nv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.128.0.80:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:16.333670 master-0 kubenswrapper[17876]: I0313 10:56:16.333593 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" podUID="1109b282-3ee4-4c4e-a64a-e6a22adeb6c9" containerName="console-operator" probeResult="failure" output="Get \"https://10.128.0.80:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:16.493700 master-0 kubenswrapper[17876]: I0313 10:56:16.493634 17876 scope.go:117] "RemoveContainer" containerID="c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748"
Mar 13 10:56:16.494022 master-0 kubenswrapper[17876]: E0313 10:56:16.493982 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee"
Mar 13 10:56:19.404962 master-0 kubenswrapper[17876]: I0313 10:56:19.404845 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:19.404962 master-0 kubenswrapper[17876]: I0313 10:56:19.404947 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:19.405817 master-0 kubenswrapper[17876]: I0313 10:56:19.404987 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:19.405817 master-0 kubenswrapper[17876]: I0313 10:56:19.405009 17876 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:19.739554 master-0 kubenswrapper[17876]: I0313 10:56:19.739399 17876 patch_prober.go:28] interesting pod/console-operator-6c7fb6b958-rb7nv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.128.0.80:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:19.739554 master-0 kubenswrapper[17876]: I0313 10:56:19.739476 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" podUID="1109b282-3ee4-4c4e-a64a-e6a22adeb6c9" containerName="console-operator" probeResult="failure" output="Get \"https://10.128.0.80:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:19.739903 master-0 kubenswrapper[17876]: I0313 10:56:19.739589 17876 patch_prober.go:28] interesting pod/console-operator-6c7fb6b958-rb7nv container/console-operator namespace/openshift-console-operator: Liveness probe status=failure output="Get \"https://10.128.0.80:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:19.739903 master-0 kubenswrapper[17876]: I0313 10:56:19.739626 17876 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" podUID="1109b282-3ee4-4c4e-a64a-e6a22adeb6c9" containerName="console-operator" probeResult="failure" output="Get \"https://10.128.0.80:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:20.848125 master-0 kubenswrapper[17876]: I0313 10:56:20.847997 17876 patch_prober.go:28] interesting pod/route-controller-manager-69c7cffc4c-7h7mv container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.89:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:20.848844 master-0 kubenswrapper[17876]: I0313 10:56:20.848204 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" podUID="86a4a18e-2256-4c27-9953-1a9dca3926d6" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.89:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:22.403416 master-0 kubenswrapper[17876]: I0313 10:56:22.403336 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:22.404183 master-0 kubenswrapper[17876]: I0313 10:56:22.403421 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:22.404183 master-0 kubenswrapper[17876]: I0313 10:56:22.403464 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:22.404183 master-0 kubenswrapper[17876]: I0313 10:56:22.403560 17876 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:22.404183 master-0 kubenswrapper[17876]: I0313 10:56:22.403616 17876 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd"
Mar 13 10:56:22.404462 master-0 kubenswrapper[17876]: I0313 10:56:22.404409 17876 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"8b8db2cc532dce87e8f0525e63017202c20cb334885568fe9a9f6b0def42b2b3"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Mar 13 10:56:22.404533 master-0 kubenswrapper[17876]: I0313 10:56:22.404463 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" containerID="cri-o://8b8db2cc532dce87e8f0525e63017202c20cb334885568fe9a9f6b0def42b2b3" gracePeriod=30
Mar 13 10:56:23.404965 master-0 kubenswrapper[17876]: I0313 10:56:23.404897 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:23.404965 master-0 kubenswrapper[17876]: I0313 10:56:23.404970 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:24.516815 master-0 kubenswrapper[17876]: I0313 10:56:24.516745 17876 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:24.517361 master-0 kubenswrapper[17876]: I0313 10:56:24.516839 17876 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="d4c95608e26ddbbd2e5890fcd9f507b5" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:25.403992 master-0 kubenswrapper[17876]: I0313 10:56:25.403932 17876 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-pchtd container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 13 10:56:25.404241 master-0 kubenswrapper[17876]: I0313 10:56:25.403994 17876 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" podUID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.15:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 13 10:56:26.428683 master-0 kubenswrapper[17876]: I0313 10:56:26.428638 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-pchtd_3f872e59-1de1-4a95-8064-79696c73e8ab/openshift-config-operator/2.log"
Mar 13 10:56:26.429551 master-0 kubenswrapper[17876]: I0313 10:56:26.429488 17876 generic.go:334] "Generic (PLEG): container finished" podID="3f872e59-1de1-4a95-8064-79696c73e8ab" containerID="8b8db2cc532dce87e8f0525e63017202c20cb334885568fe9a9f6b0def42b2b3" exitCode=255
Mar 13 10:56:26.429551 master-0 kubenswrapper[17876]: I0313 10:56:26.429526 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerDied","Data":"8b8db2cc532dce87e8f0525e63017202c20cb334885568fe9a9f6b0def42b2b3"}
Mar 13 10:56:26.429690 master-0 kubenswrapper[17876]: I0313 10:56:26.429558 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" event={"ID":"3f872e59-1de1-4a95-8064-79696c73e8ab","Type":"ContainerStarted","Data":"cd12bdbaba495724ce857ebe2ec33c6c5fd4fbf7092fd3e489c375dd9ac854b0"}
Mar 13 10:56:26.429690 master-0 kubenswrapper[17876]: I0313 10:56:26.429576 17876 scope.go:117] "RemoveContainer" containerID="80e219d86f62937cb95412f8c97959374104036b5299a214ae589c72f2965a63"
Mar 13 10:56:26.429769 master-0 kubenswrapper[17876]: I0313
10:56:26.429744 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:56:27.447725 master-0 kubenswrapper[17876]: I0313 10:56:27.447686 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-pchtd_3f872e59-1de1-4a95-8064-79696c73e8ab/openshift-config-operator/2.log" Mar 13 10:56:28.494051 master-0 kubenswrapper[17876]: I0313 10:56:28.494006 17876 scope.go:117] "RemoveContainer" containerID="c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748" Mar 13 10:56:28.494599 master-0 kubenswrapper[17876]: E0313 10:56:28.494239 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kcw4k_openshift-cluster-storage-operator(84f78350-e85c-4377-97cd-9e9a1b2ff4ee)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" podUID="84f78350-e85c-4377-97cd-9e9a1b2ff4ee" Mar 13 10:56:28.746402 master-0 kubenswrapper[17876]: I0313 10:56:28.746273 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-rb7nv" Mar 13 10:56:29.219238 master-0 kubenswrapper[17876]: I0313 10:56:29.219167 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6"] Mar 13 10:56:29.852505 master-0 kubenswrapper[17876]: I0313 10:56:29.852435 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-69c7cffc4c-7h7mv" Mar 13 10:56:30.406210 master-0 kubenswrapper[17876]: I0313 10:56:30.406144 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-pchtd" Mar 13 10:56:31.519859 master-0 kubenswrapper[17876]: I0313 10:56:31.519794 17876 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:56:31.525778 master-0 kubenswrapper[17876]: I0313 10:56:31.525725 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 13 10:56:36.536594 master-0 kubenswrapper[17876]: I0313 10:56:36.536543 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/3.log" Mar 13 10:56:36.540247 master-0 kubenswrapper[17876]: I0313 10:56:36.540189 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/2.log" Mar 13 10:56:36.540914 master-0 kubenswrapper[17876]: I0313 10:56:36.540864 17876 generic.go:334] "Generic (PLEG): container finished" podID="0881de70-2db3-4fc2-b976-b55c11dc239d" containerID="98ef60368c7b2c21e223da3e64f5b0b3b589d9ad858fd40c4b2a65df1c36e8cf" exitCode=1 Mar 13 10:56:36.540982 master-0 kubenswrapper[17876]: I0313 10:56:36.540924 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerDied","Data":"98ef60368c7b2c21e223da3e64f5b0b3b589d9ad858fd40c4b2a65df1c36e8cf"} Mar 13 10:56:36.541062 master-0 kubenswrapper[17876]: I0313 10:56:36.541027 17876 scope.go:117] "RemoveContainer" containerID="a2a3a6dbc09acd3b5c004ca8d99a0995e19ad749f409ecb1761d51cda96db999" Mar 13 10:56:36.541835 master-0 kubenswrapper[17876]: I0313 10:56:36.541805 17876 scope.go:117] 
"RemoveContainer" containerID="98ef60368c7b2c21e223da3e64f5b0b3b589d9ad858fd40c4b2a65df1c36e8cf" Mar 13 10:56:36.542529 master-0 kubenswrapper[17876]: E0313 10:56:36.542500 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-2c4sl_openshift-machine-api(0881de70-2db3-4fc2-b976-b55c11dc239d)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" podUID="0881de70-2db3-4fc2-b976-b55c11dc239d" Mar 13 10:56:37.550882 master-0 kubenswrapper[17876]: I0313 10:56:37.550772 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/3.log" Mar 13 10:56:40.494581 master-0 kubenswrapper[17876]: I0313 10:56:40.494433 17876 scope.go:117] "RemoveContainer" containerID="c5dfd744b884ff656fb7c75fbafdd1de651c02c8eca1ef3e5809607ce3c46748" Mar 13 10:56:41.847877 master-0 kubenswrapper[17876]: I0313 10:56:41.847828 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kcw4k_84f78350-e85c-4377-97cd-9e9a1b2ff4ee/snapshot-controller/4.log" Mar 13 10:56:41.848771 master-0 kubenswrapper[17876]: I0313 10:56:41.847929 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kcw4k" event={"ID":"84f78350-e85c-4377-97cd-9e9a1b2ff4ee","Type":"ContainerStarted","Data":"0ae48d2aae8de733c459c2af47d9fbfb61b783766db639f539e5bf495ae97e70"} Mar 13 10:56:47.615209 master-0 kubenswrapper[17876]: I0313 10:56:47.615109 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-r7t4l"] Mar 13 10:56:47.616288 master-0 kubenswrapper[17876]: I0313 
10:56:47.615481 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="telemeter-client" containerID="cri-o://908b7375b91dcf2baab20c05c0f72cd3fc66c40e43b8f4f484d6e4c6f9345dde" gracePeriod=30 Mar 13 10:56:47.616288 master-0 kubenswrapper[17876]: I0313 10:56:47.616044 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="kube-rbac-proxy" containerID="cri-o://7a15eb5240bfdd7af38a32fca27f9dc99472114c912a39deca7e2c742392f9c0" gracePeriod=30 Mar 13 10:56:47.616288 master-0 kubenswrapper[17876]: I0313 10:56:47.616104 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="reload" containerID="cri-o://e79967d0b76f0c0562a0acc8d7863ef6baf6011d87f652e4e181a797f0f82df2" gracePeriod=30 Mar 13 10:56:47.948123 master-0 kubenswrapper[17876]: I0313 10:56:47.945399 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6644589945-r7t4l_84d7119d-7c04-4168-9008-83414ea5d79e/telemeter-client/0.log" Mar 13 10:56:47.948123 master-0 kubenswrapper[17876]: I0313 10:56:47.945460 17876 generic.go:334] "Generic (PLEG): container finished" podID="84d7119d-7c04-4168-9008-83414ea5d79e" containerID="e79967d0b76f0c0562a0acc8d7863ef6baf6011d87f652e4e181a797f0f82df2" exitCode=0 Mar 13 10:56:47.948123 master-0 kubenswrapper[17876]: I0313 10:56:47.945476 17876 generic.go:334] "Generic (PLEG): container finished" podID="84d7119d-7c04-4168-9008-83414ea5d79e" containerID="908b7375b91dcf2baab20c05c0f72cd3fc66c40e43b8f4f484d6e4c6f9345dde" exitCode=2 Mar 13 10:56:47.948123 master-0 kubenswrapper[17876]: I0313 10:56:47.945518 17876 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerDied","Data":"e79967d0b76f0c0562a0acc8d7863ef6baf6011d87f652e4e181a797f0f82df2"} Mar 13 10:56:47.948123 master-0 kubenswrapper[17876]: I0313 10:56:47.945594 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerDied","Data":"908b7375b91dcf2baab20c05c0f72cd3fc66c40e43b8f4f484d6e4c6f9345dde"} Mar 13 10:56:48.495487 master-0 kubenswrapper[17876]: I0313 10:56:48.495438 17876 scope.go:117] "RemoveContainer" containerID="98ef60368c7b2c21e223da3e64f5b0b3b589d9ad858fd40c4b2a65df1c36e8cf" Mar 13 10:56:48.495738 master-0 kubenswrapper[17876]: E0313 10:56:48.495709 17876 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-2c4sl_openshift-machine-api(0881de70-2db3-4fc2-b976-b55c11dc239d)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" podUID="0881de70-2db3-4fc2-b976-b55c11dc239d" Mar 13 10:56:49.128123 master-0 kubenswrapper[17876]: I0313 10:56:49.123246 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6644589945-r7t4l_84d7119d-7c04-4168-9008-83414ea5d79e/telemeter-client/0.log" Mar 13 10:56:49.128123 master-0 kubenswrapper[17876]: I0313 10:56:49.123302 17876 generic.go:334] "Generic (PLEG): container finished" podID="84d7119d-7c04-4168-9008-83414ea5d79e" containerID="7a15eb5240bfdd7af38a32fca27f9dc99472114c912a39deca7e2c742392f9c0" exitCode=0 Mar 13 10:56:49.128123 master-0 kubenswrapper[17876]: I0313 10:56:49.123332 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerDied","Data":"7a15eb5240bfdd7af38a32fca27f9dc99472114c912a39deca7e2c742392f9c0"} Mar 13 10:56:49.592384 master-0 kubenswrapper[17876]: I0313 10:56:49.592326 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6644589945-r7t4l_84d7119d-7c04-4168-9008-83414ea5d79e/telemeter-client/0.log" Mar 13 10:56:49.592664 master-0 kubenswrapper[17876]: I0313 10:56:49.592440 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.719773 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.719895 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client-kube-rbac-proxy-config\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.720046 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-client-tls\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.720179 17876 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-federate-client-tls\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.720281 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-metrics-client-ca\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.720431 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-serving-certs-ca-bundle\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.720480 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-trusted-ca-bundle\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925405 master-0 kubenswrapper[17876]: I0313 10:56:49.720537 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzl9p\" (UniqueName: \"kubernetes.io/projected/84d7119d-7c04-4168-9008-83414ea5d79e-kube-api-access-jzl9p\") pod \"84d7119d-7c04-4168-9008-83414ea5d79e\" (UID: \"84d7119d-7c04-4168-9008-83414ea5d79e\") " Mar 13 10:56:49.925974 master-0 kubenswrapper[17876]: I0313 10:56:49.925609 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-serving-certs-ca-bundle" (OuterVolumeSpecName: "serving-certs-ca-bundle") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:49.929577 master-0 kubenswrapper[17876]: I0313 10:56:49.929537 17876 reconciler_common.go:293] "Volume detached for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:49.931941 master-0 kubenswrapper[17876]: I0313 10:56:49.931893 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:49.934856 master-0 kubenswrapper[17876]: I0313 10:56:49.934791 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-trusted-ca-bundle" (OuterVolumeSpecName: "telemeter-trusted-ca-bundle") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "telemeter-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:49.938151 master-0 kubenswrapper[17876]: I0313 10:56:49.937878 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client-kube-rbac-proxy-config" (OuterVolumeSpecName: "secret-telemeter-client-kube-rbac-proxy-config") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "secret-telemeter-client-kube-rbac-proxy-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:49.948803 master-0 kubenswrapper[17876]: I0313 10:56:49.942270 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-federate-client-tls" (OuterVolumeSpecName: "federate-client-tls") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "federate-client-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:49.948803 master-0 kubenswrapper[17876]: I0313 10:56:49.942288 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client" (OuterVolumeSpecName: "secret-telemeter-client") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "secret-telemeter-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:49.948803 master-0 kubenswrapper[17876]: I0313 10:56:49.944876 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/84d7119d-7c04-4168-9008-83414ea5d79e-kube-api-access-jzl9p" (OuterVolumeSpecName: "kube-api-access-jzl9p") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "kube-api-access-jzl9p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:56:49.960897 master-0 kubenswrapper[17876]: I0313 10:56:49.960692 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-client-tls" (OuterVolumeSpecName: "telemeter-client-tls") pod "84d7119d-7c04-4168-9008-83414ea5d79e" (UID: "84d7119d-7c04-4168-9008-83414ea5d79e"). InnerVolumeSpecName "telemeter-client-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:50.030810 master-0 kubenswrapper[17876]: I0313 10:56:50.030729 17876 reconciler_common.go:293] "Volume detached for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.030810 master-0 kubenswrapper[17876]: I0313 10:56:50.030787 17876 reconciler_common.go:293] "Volume detached for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-secret-telemeter-client-kube-rbac-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.031120 master-0 kubenswrapper[17876]: I0313 10:56:50.030807 17876 reconciler_common.go:293] "Volume detached for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-client-tls\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.031120 master-0 kubenswrapper[17876]: I0313 10:56:50.030841 17876 reconciler_common.go:293] "Volume detached for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/84d7119d-7c04-4168-9008-83414ea5d79e-federate-client-tls\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.031120 master-0 kubenswrapper[17876]: I0313 10:56:50.030853 17876 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.031120 master-0 kubenswrapper[17876]: I0313 10:56:50.030867 17876 reconciler_common.go:293] "Volume detached for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/84d7119d-7c04-4168-9008-83414ea5d79e-telemeter-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.031120 master-0 kubenswrapper[17876]: I0313 10:56:50.030878 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzl9p\" (UniqueName: \"kubernetes.io/projected/84d7119d-7c04-4168-9008-83414ea5d79e-kube-api-access-jzl9p\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:50.226209 master-0 kubenswrapper[17876]: I0313 10:56:50.222894 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-6644589945-r7t4l_84d7119d-7c04-4168-9008-83414ea5d79e/telemeter-client/0.log" Mar 13 10:56:50.226209 master-0 kubenswrapper[17876]: I0313 10:56:50.223002 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" event={"ID":"84d7119d-7c04-4168-9008-83414ea5d79e","Type":"ContainerDied","Data":"ef7d8004c639f1641cf26738f7adb78f5d72c09997b6c7788ef2750d25c63a07"} Mar 13 10:56:50.226209 master-0 kubenswrapper[17876]: I0313 10:56:50.223136 17876 scope.go:117] "RemoveContainer" containerID="7a15eb5240bfdd7af38a32fca27f9dc99472114c912a39deca7e2c742392f9c0" Mar 13 10:56:50.226209 master-0 kubenswrapper[17876]: I0313 10:56:50.223458 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6644589945-r7t4l" Mar 13 10:56:50.322128 master-0 kubenswrapper[17876]: I0313 10:56:50.318362 17876 scope.go:117] "RemoveContainer" containerID="e79967d0b76f0c0562a0acc8d7863ef6baf6011d87f652e4e181a797f0f82df2" Mar 13 10:56:50.333649 master-0 kubenswrapper[17876]: I0313 10:56:50.333547 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-r7t4l"] Mar 13 10:56:50.340471 master-0 kubenswrapper[17876]: I0313 10:56:50.340420 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-r7t4l"] Mar 13 10:56:50.350652 master-0 kubenswrapper[17876]: I0313 10:56:50.349372 17876 scope.go:117] "RemoveContainer" containerID="908b7375b91dcf2baab20c05c0f72cd3fc66c40e43b8f4f484d6e4c6f9345dde" Mar 13 10:56:50.502950 master-0 kubenswrapper[17876]: I0313 10:56:50.502813 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" path="/var/lib/kubelet/pods/84d7119d-7c04-4168-9008-83414ea5d79e/volumes" Mar 13 10:56:54.260998 master-0 kubenswrapper[17876]: I0313 10:56:54.260934 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" podUID="0673d5a0-3ff3-4d30-995b-829d3f165071" containerName="oauth-openshift" containerID="cri-o://f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7" gracePeriod=15 Mar 13 10:56:54.832224 master-0 kubenswrapper[17876]: I0313 10:56:54.832032 17876 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:56:54.950771 master-0 kubenswrapper[17876]: I0313 10:56:54.950704 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-login\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951165 master-0 kubenswrapper[17876]: I0313 10:56:54.951143 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-dir\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951339 master-0 kubenswrapper[17876]: I0313 10:56:54.951313 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-policies\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951576 master-0 kubenswrapper[17876]: I0313 10:56:54.951227 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 13 10:56:54.951576 master-0 kubenswrapper[17876]: I0313 10:56:54.951522 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-error\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951752 master-0 kubenswrapper[17876]: I0313 10:56:54.951626 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-cliconfig\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951752 master-0 kubenswrapper[17876]: I0313 10:56:54.951714 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-serving-cert\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951752 master-0 kubenswrapper[17876]: I0313 10:56:54.951749 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-trusted-ca-bundle\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951888 master-0 kubenswrapper[17876]: I0313 10:56:54.951792 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-ocp-branding-template\") pod 
\"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951888 master-0 kubenswrapper[17876]: I0313 10:56:54.951818 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcl2w\" (UniqueName: \"kubernetes.io/projected/0673d5a0-3ff3-4d30-995b-829d3f165071-kube-api-access-qcl2w\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951888 master-0 kubenswrapper[17876]: I0313 10:56:54.951850 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-router-certs\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.951888 master-0 kubenswrapper[17876]: I0313 10:56:54.951875 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-provider-selection\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.952223 master-0 kubenswrapper[17876]: I0313 10:56:54.951902 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-session\") pod \"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.952223 master-0 kubenswrapper[17876]: I0313 10:56:54.951926 17876 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-service-ca\") pod 
\"0673d5a0-3ff3-4d30-995b-829d3f165071\" (UID: \"0673d5a0-3ff3-4d30-995b-829d3f165071\") " Mar 13 10:56:54.952223 master-0 kubenswrapper[17876]: I0313 10:56:54.952174 17876 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:54.952599 master-0 kubenswrapper[17876]: I0313 10:56:54.951832 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:54.952599 master-0 kubenswrapper[17876]: I0313 10:56:54.952056 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:54.952599 master-0 kubenswrapper[17876]: I0313 10:56:54.952578 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:54.952761 master-0 kubenswrapper[17876]: I0313 10:56:54.952592 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 13 10:56:54.954804 master-0 kubenswrapper[17876]: I0313 10:56:54.954766 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:54.955419 master-0 kubenswrapper[17876]: I0313 10:56:54.955385 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:54.955751 master-0 kubenswrapper[17876]: I0313 10:56:54.955674 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:54.955751 master-0 kubenswrapper[17876]: I0313 10:56:54.955718 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:54.955989 master-0 kubenswrapper[17876]: I0313 10:56:54.955964 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:54.956578 master-0 kubenswrapper[17876]: I0313 10:56:54.956447 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:54.957292 master-0 kubenswrapper[17876]: I0313 10:56:54.957220 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0673d5a0-3ff3-4d30-995b-829d3f165071-kube-api-access-qcl2w" (OuterVolumeSpecName: "kube-api-access-qcl2w") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "kube-api-access-qcl2w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 13 10:56:54.960543 master-0 kubenswrapper[17876]: I0313 10:56:54.960464 17876 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0673d5a0-3ff3-4d30-995b-829d3f165071" (UID: "0673d5a0-3ff3-4d30-995b-829d3f165071"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052843 17876 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052913 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052930 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052943 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052957 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052971 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 
kubenswrapper[17876]: I0313 10:56:55.052985 17876 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcl2w\" (UniqueName: \"kubernetes.io/projected/0673d5a0-3ff3-4d30-995b-829d3f165071-kube-api-access-qcl2w\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.052996 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.053010 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.053021 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.053034 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.053075 master-0 kubenswrapper[17876]: I0313 10:56:55.053046 17876 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0673d5a0-3ff3-4d30-995b-829d3f165071-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 13 10:56:55.265432 master-0 kubenswrapper[17876]: I0313 10:56:55.265369 17876 generic.go:334] "Generic (PLEG): container 
finished" podID="0673d5a0-3ff3-4d30-995b-829d3f165071" containerID="f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7" exitCode=0 Mar 13 10:56:55.266040 master-0 kubenswrapper[17876]: I0313 10:56:55.265423 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" event={"ID":"0673d5a0-3ff3-4d30-995b-829d3f165071","Type":"ContainerDied","Data":"f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7"} Mar 13 10:56:55.266040 master-0 kubenswrapper[17876]: I0313 10:56:55.265465 17876 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" Mar 13 10:56:55.266040 master-0 kubenswrapper[17876]: I0313 10:56:55.265493 17876 scope.go:117] "RemoveContainer" containerID="f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7" Mar 13 10:56:55.266040 master-0 kubenswrapper[17876]: I0313 10:56:55.265480 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6" event={"ID":"0673d5a0-3ff3-4d30-995b-829d3f165071","Type":"ContainerDied","Data":"7540fa4219d6a8910c174ebbf42745831c7ed3ff23c260516d4014f136e1f42c"} Mar 13 10:56:55.287985 master-0 kubenswrapper[17876]: I0313 10:56:55.287950 17876 scope.go:117] "RemoveContainer" containerID="f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7" Mar 13 10:56:55.288635 master-0 kubenswrapper[17876]: E0313 10:56:55.288586 17876 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7\": container with ID starting with f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7 not found: ID does not exist" containerID="f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7" Mar 13 10:56:55.288699 master-0 kubenswrapper[17876]: I0313 10:56:55.288641 
17876 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7"} err="failed to get container status \"f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7\": rpc error: code = NotFound desc = could not find container \"f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7\": container with ID starting with f63a7109f6967495bf0b00018639e9ead7347da770635bef138645e8c3880ad7 not found: ID does not exist" Mar 13 10:56:55.306324 master-0 kubenswrapper[17876]: I0313 10:56:55.306251 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6"] Mar 13 10:56:55.317596 master-0 kubenswrapper[17876]: I0313 10:56:55.317511 17876 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-8555c5bbdd-kbpw6"] Mar 13 10:56:56.503805 master-0 kubenswrapper[17876]: I0313 10:56:56.503716 17876 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0673d5a0-3ff3-4d30-995b-829d3f165071" path="/var/lib/kubelet/pods/0673d5a0-3ff3-4d30-995b-829d3f165071/volumes" Mar 13 10:56:59.494582 master-0 kubenswrapper[17876]: I0313 10:56:59.494513 17876 scope.go:117] "RemoveContainer" containerID="98ef60368c7b2c21e223da3e64f5b0b3b589d9ad858fd40c4b2a65df1c36e8cf" Mar 13 10:57:00.308692 master-0 kubenswrapper[17876]: I0313 10:57:00.308625 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-2c4sl_0881de70-2db3-4fc2-b976-b55c11dc239d/cluster-baremetal-operator/3.log" Mar 13 10:57:00.309067 master-0 kubenswrapper[17876]: I0313 10:57:00.309024 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-2c4sl" 
event={"ID":"0881de70-2db3-4fc2-b976-b55c11dc239d","Type":"ContainerStarted","Data":"5e0b4f44d93060557c37756b821f1e93aca12533bc6233f7b9ceca65e948cb97"} Mar 13 10:58:24.479231 master-0 kubenswrapper[17876]: I0313 10:58:24.479051 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-8kd6c_ecb5bdcc-647d-4292-a33d-dc3df331c206/authentication-operator/0.log" Mar 13 10:58:24.679128 master-0 kubenswrapper[17876]: I0313 10:58:24.677802 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-8kd6c_ecb5bdcc-647d-4292-a33d-dc3df331c206/authentication-operator/1.log" Mar 13 10:58:24.867137 master-0 kubenswrapper[17876]: I0313 10:58:24.866924 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-79f8cd6fdd-mbkch_94f7921a-6d0f-45b7-ba8f-9f2ef74b044e/router/0.log" Mar 13 10:58:25.058857 master-0 kubenswrapper[17876]: I0313 10:58:25.058799 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-999d99f5f-hlk52_a9258b0f-fdcc-4bfa-b982-5cf3c899c432/fix-audit-permissions/0.log" Mar 13 10:58:25.263509 master-0 kubenswrapper[17876]: I0313 10:58:25.263446 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-999d99f5f-hlk52_a9258b0f-fdcc-4bfa-b982-5cf3c899c432/oauth-apiserver/0.log" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: I0313 10:59:42.647194 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fdm8d/must-gather-r2l86"] Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: E0313 10:59:42.647685 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de9eb09a-0b9b-4190-b3ce-7eb971c93fae" containerName="installer" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: I0313 10:59:42.647710 17876 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="de9eb09a-0b9b-4190-b3ce-7eb971c93fae" containerName="installer" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: E0313 10:59:42.647747 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="reload" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: I0313 10:59:42.647756 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="reload" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: E0313 10:59:42.647790 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="telemeter-client" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: I0313 10:59:42.647798 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="telemeter-client" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: E0313 10:59:42.647809 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0673d5a0-3ff3-4d30-995b-829d3f165071" containerName="oauth-openshift" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: I0313 10:59:42.647818 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="0673d5a0-3ff3-4d30-995b-829d3f165071" containerName="oauth-openshift" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: E0313 10:59:42.647831 17876 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="kube-rbac-proxy" Mar 13 10:59:42.647956 master-0 kubenswrapper[17876]: I0313 10:59:42.647838 17876 state_mem.go:107] "Deleted CPUSet assignment" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="kube-rbac-proxy" Mar 13 10:59:42.650242 master-0 kubenswrapper[17876]: I0313 10:59:42.648060 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="de9eb09a-0b9b-4190-b3ce-7eb971c93fae" containerName="installer" Mar 13 10:59:42.650242 master-0 
kubenswrapper[17876]: I0313 10:59:42.648087 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="telemeter-client" Mar 13 10:59:42.650242 master-0 kubenswrapper[17876]: I0313 10:59:42.648132 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="0673d5a0-3ff3-4d30-995b-829d3f165071" containerName="oauth-openshift" Mar 13 10:59:42.650242 master-0 kubenswrapper[17876]: I0313 10:59:42.648166 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="reload" Mar 13 10:59:42.650242 master-0 kubenswrapper[17876]: I0313 10:59:42.648180 17876 memory_manager.go:354] "RemoveStaleState removing state" podUID="84d7119d-7c04-4168-9008-83414ea5d79e" containerName="kube-rbac-proxy" Mar 13 10:59:42.650242 master-0 kubenswrapper[17876]: I0313 10:59:42.649265 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:42.656431 master-0 kubenswrapper[17876]: I0313 10:59:42.654011 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fdm8d"/"kube-root-ca.crt" Mar 13 10:59:42.657344 master-0 kubenswrapper[17876]: I0313 10:59:42.657302 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fdm8d"/"openshift-service-ca.crt" Mar 13 10:59:42.662068 master-0 kubenswrapper[17876]: I0313 10:59:42.661658 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl"] Mar 13 10:59:42.662859 master-0 kubenswrapper[17876]: I0313 10:59:42.662823 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.667442 master-0 kubenswrapper[17876]: I0313 10:59:42.667377 17876 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 13 10:59:42.667723 master-0 kubenswrapper[17876]: I0313 10:59:42.667609 17876 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 13 10:59:42.675036 master-0 kubenswrapper[17876]: I0313 10:59:42.674843 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-89944f9bc-f5dt2"] Mar 13 10:59:42.677874 master-0 kubenswrapper[17876]: I0313 10:59:42.677590 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.688030 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-kkkpw" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.688042 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.688283 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.688420 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.688535 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 
10:59:42.688619 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.688648 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.689020 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.689238 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.690396 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 13 10:59:42.691146 master-0 kubenswrapper[17876]: I0313 10:59:42.690670 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 13 10:59:42.691865 master-0 kubenswrapper[17876]: I0313 10:59:42.691412 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 13 10:59:42.707126 master-0 kubenswrapper[17876]: I0313 10:59:42.695378 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-b8bb4856b-64pzc"] Mar 13 10:59:42.710833 master-0 kubenswrapper[17876]: I0313 10:59:42.710786 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 13 10:59:42.712668 master-0 kubenswrapper[17876]: I0313 10:59:42.712643 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-mndjh"] Mar 13 10:59:42.713904 master-0 kubenswrapper[17876]: I0313 10:59:42.713882 17876 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:42.714562 master-0 kubenswrapper[17876]: I0313 10:59:42.714545 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.719312 master-0 kubenswrapper[17876]: I0313 10:59:42.719035 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 13 10:59:42.721823 master-0 kubenswrapper[17876]: I0313 10:59:42.721767 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 13 10:59:42.722178 master-0 kubenswrapper[17876]: I0313 10:59:42.722150 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 13 10:59:42.722414 master-0 kubenswrapper[17876]: I0313 10:59:42.722384 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.864719 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-error\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.864803 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: 
\"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.864856 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnwjr\" (UniqueName: \"kubernetes.io/projected/244bdfab-08d8-4388-9791-b9fb3ea1a63e-kube-api-access-qnwjr\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.864898 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.864939 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-login\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.864963 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hgcnt\" (UniqueName: \"kubernetes.io/projected/7553f594-9ac7-4a26-8df1-fe7dd957681b-kube-api-access-hgcnt\") pod \"must-gather-r2l86\" (UID: \"7553f594-9ac7-4a26-8df1-fe7dd957681b\") " pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:42.875414 master-0 
kubenswrapper[17876]: I0313 10:59:42.865011 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-audit-policies\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.865047 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7553f594-9ac7-4a26-8df1-fe7dd957681b-must-gather-output\") pod \"must-gather-r2l86\" (UID: \"7553f594-9ac7-4a26-8df1-fe7dd957681b\") " pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:42.875414 master-0 kubenswrapper[17876]: I0313 10:59:42.865071 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-router-certs\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877310 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877431 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-w4xkx\" (UniqueName: \"kubernetes.io/projected/b5fb8df2-3f81-41d8-b588-7966f8e499f0-kube-api-access-w4xkx\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877492 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-session\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877569 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877649 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5fb8df2-3f81-41d8-b588-7966f8e499f0-audit-dir\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877688 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/244bdfab-08d8-4388-9791-b9fb3ea1a63e-webhook-cert\") pod 
\"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877754 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-service-ca\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877805 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/244bdfab-08d8-4388-9791-b9fb3ea1a63e-apiservice-cert\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.879126 master-0 kubenswrapper[17876]: I0313 10:59:42.877869 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.923321 master-0 kubenswrapper[17876]: I0313 10:59:42.922603 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dcljb"] Mar 13 10:59:42.925483 master-0 kubenswrapper[17876]: I0313 10:59:42.923925 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:42.945260 master-0 kubenswrapper[17876]: I0313 10:59:42.945213 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b"] Mar 13 10:59:42.946600 master-0 kubenswrapper[17876]: I0313 10:59:42.946579 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:42.964401 master-0 kubenswrapper[17876]: I0313 10:59:42.952581 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 13 10:59:42.973125 master-0 kubenswrapper[17876]: I0313 10:59:42.970245 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-wk7kn"] Mar 13 10:59:42.973125 master-0 kubenswrapper[17876]: I0313 10:59:42.971756 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980089 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4nrh\" (UniqueName: \"kubernetes.io/projected/1a552e5f-3a19-4d3d-91ae-1b979910b87c-kube-api-access-m4nrh\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980183 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b\" (UID: \"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980212 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-config\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980271 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-error\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980302 17876 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8gng\" (UniqueName: \"kubernetes.io/projected/fb2ed9c3-da42-4b9a-a79f-3869cf1076e9-kube-api-access-t8gng\") pod \"observability-operator-59bdc8b94-mndjh\" (UID: \"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9\") " pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980328 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980358 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnwjr\" (UniqueName: \"kubernetes.io/projected/244bdfab-08d8-4388-9791-b9fb3ea1a63e-kube-api-access-qnwjr\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980386 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980408 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-oauth-serving-cert\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980432 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-service-ca\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980470 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hgcnt\" (UniqueName: \"kubernetes.io/projected/7553f594-9ac7-4a26-8df1-fe7dd957681b-kube-api-access-hgcnt\") pod \"must-gather-r2l86\" (UID: \"7553f594-9ac7-4a26-8df1-fe7dd957681b\") " pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980513 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-login\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980553 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-audit-policies\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980597 
17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7553f594-9ac7-4a26-8df1-fe7dd957681b-must-gather-output\") pod \"must-gather-r2l86\" (UID: \"7553f594-9ac7-4a26-8df1-fe7dd957681b\") " pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980630 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-router-certs\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980659 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b\" (UID: \"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980689 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980715 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-serving-cert\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980745 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4xkx\" (UniqueName: \"kubernetes.io/projected/b5fb8df2-3f81-41d8-b588-7966f8e499f0-kube-api-access-w4xkx\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980773 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-session\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980807 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-trusted-ca-bundle\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980835 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 
10:59:42.980861 master-0 kubenswrapper[17876]: I0313 10:59:42.980868 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-oauth-config\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.980920 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbmz2\" (UniqueName: \"kubernetes.io/projected/d1708115-95b0-47fc-a0f3-0fbdb3786cd4-kube-api-access-lbmz2\") pod \"perses-operator-5bf474d74f-dcljb\" (UID: \"d1708115-95b0-47fc-a0f3-0fbdb3786cd4\") " pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.980959 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5fb8df2-3f81-41d8-b588-7966f8e499f0-audit-dir\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.980989 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/244bdfab-08d8-4388-9791-b9fb3ea1a63e-webhook-cert\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.981048 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-service-ca\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.981081 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/244bdfab-08d8-4388-9791-b9fb3ea1a63e-apiservice-cert\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.981144 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.981175 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb2ed9c3-da42-4b9a-a79f-3869cf1076e9-observability-operator-tls\") pod \"observability-operator-59bdc8b94-mndjh\" (UID: \"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9\") " pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:42.981788 master-0 kubenswrapper[17876]: I0313 10:59:42.981205 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1708115-95b0-47fc-a0f3-0fbdb3786cd4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dcljb\" (UID: 
\"d1708115-95b0-47fc-a0f3-0fbdb3786cd4\") " pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:42.984443 master-0 kubenswrapper[17876]: I0313 10:59:42.982186 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-audit-policies\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.984443 master-0 kubenswrapper[17876]: I0313 10:59:42.982263 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b5fb8df2-3f81-41d8-b588-7966f8e499f0-audit-dir\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.984443 master-0 kubenswrapper[17876]: I0313 10:59:42.983821 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.984443 master-0 kubenswrapper[17876]: I0313 10:59:42.984137 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/7553f594-9ac7-4a26-8df1-fe7dd957681b-must-gather-output\") pod \"must-gather-r2l86\" (UID: \"7553f594-9ac7-4a26-8df1-fe7dd957681b\") " pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:42.984443 master-0 kubenswrapper[17876]: I0313 10:59:42.984292 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-l8t7m"] Mar 13 10:59:42.984677 
master-0 kubenswrapper[17876]: I0313 10:59:42.984639 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 13 10:59:42.994255 master-0 kubenswrapper[17876]: I0313 10:59:42.985028 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 13 10:59:42.994255 master-0 kubenswrapper[17876]: I0313 10:59:42.985172 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-login\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:42.994255 master-0 kubenswrapper[17876]: I0313 10:59:42.992659 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.001369 master-0 kubenswrapper[17876]: I0313 10:59:42.999332 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-service-ca\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.001369 master-0 kubenswrapper[17876]: I0313 10:59:43.000193 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.003460 master-0 kubenswrapper[17876]: I0313 10:59:43.003414 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-router-certs\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.018023 master-0 kubenswrapper[17876]: I0313 10:59:43.014194 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.018667 master-0 kubenswrapper[17876]: I0313 10:59:43.018602 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.020230 master-0 kubenswrapper[17876]: I0313 10:59:43.019146 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.020230 master-0 kubenswrapper[17876]: I0313 10:59:43.019304 17876 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/244bdfab-08d8-4388-9791-b9fb3ea1a63e-webhook-cert\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:43.022145 master-0 kubenswrapper[17876]: I0313 10:59:43.022060 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-user-template-error\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.031814 master-0 kubenswrapper[17876]: I0313 10:59:43.030684 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/244bdfab-08d8-4388-9791-b9fb3ea1a63e-apiservice-cert\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:43.033817 master-0 kubenswrapper[17876]: I0313 10:59:43.033420 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-tkl8r"] Mar 13 10:59:43.039041 master-0 kubenswrapper[17876]: I0313 10:59:43.038990 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b5fb8df2-3f81-41d8-b588-7966f8e499f0-v4-0-config-system-session\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.047590 master-0 kubenswrapper[17876]: I0313 10:59:43.041835 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.047590 master-0 kubenswrapper[17876]: I0313 10:59:43.044622 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 13 10:59:43.047590 master-0 kubenswrapper[17876]: I0313 10:59:43.044860 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 13 10:59:43.047590 master-0 kubenswrapper[17876]: I0313 10:59:43.045685 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 13 10:59:43.047590 master-0 kubenswrapper[17876]: I0313 10:59:43.046758 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4xkx\" (UniqueName: \"kubernetes.io/projected/b5fb8df2-3f81-41d8-b588-7966f8e499f0-kube-api-access-w4xkx\") pod \"oauth-openshift-89944f9bc-f5dt2\" (UID: \"b5fb8df2-3f81-41d8-b588-7966f8e499f0\") " pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.049746 master-0 kubenswrapper[17876]: I0313 10:59:43.049011 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 13 10:59:43.049855 master-0 kubenswrapper[17876]: I0313 10:59:43.049723 17876 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 13 10:59:43.057688 master-0 kubenswrapper[17876]: I0313 10:59:43.057643 17876 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 13 10:59:43.058570 master-0 kubenswrapper[17876]: I0313 10:59:43.058509 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fdm8d/must-gather-7szn2"] Mar 13 10:59:43.074238 master-0 kubenswrapper[17876]: I0313 10:59:43.073408 17876 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-l5ddw"] Mar 13 10:59:43.074458 master-0 kubenswrapper[17876]: I0313 10:59:43.074314 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.079454 master-0 kubenswrapper[17876]: I0313 10:59:43.079390 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.082841 master-0 kubenswrapper[17876]: I0313 10:59:43.082435 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnwjr\" (UniqueName: \"kubernetes.io/projected/244bdfab-08d8-4388-9791-b9fb3ea1a63e-kube-api-access-qnwjr\") pod \"metallb-operator-webhook-server-9c6b5c975-mh7gl\" (UID: \"244bdfab-08d8-4388-9791-b9fb3ea1a63e\") " pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:43.084085 master-0 kubenswrapper[17876]: I0313 10:59:43.084020 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b\" (UID: \"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:43.084170 master-0 kubenswrapper[17876]: I0313 10:59:43.084128 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-secret-telemeter-client\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.084210 master-0 kubenswrapper[17876]: I0313 10:59:43.084165 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-serving-cert\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.084240 master-0 kubenswrapper[17876]: I0313 10:59:43.084204 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-metrics-client-ca\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.084270 master-0 kubenswrapper[17876]: I0313 10:59:43.084239 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ffcpd\" (UniqueName: \"kubernetes.io/projected/41f9cf46-25ab-4720-b133-7d569e65e625-kube-api-access-ffcpd\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.084814 master-0 kubenswrapper[17876]: I0313 10:59:43.084777 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkbj4\" (UniqueName: \"kubernetes.io/projected/f761e3f1-1350-443e-95dc-10fde5d5ded6-kube-api-access-fkbj4\") pod \"must-gather-7szn2\" (UID: \"f761e3f1-1350-443e-95dc-10fde5d5ded6\") " pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.084887 master-0 kubenswrapper[17876]: I0313 10:59:43.084824 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-secret-telemeter-client-kube-rbac-proxy-config\") pod 
\"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.084887 master-0 kubenswrapper[17876]: I0313 10:59:43.084854 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-trusted-ca-bundle\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.084957 master-0 kubenswrapper[17876]: I0313 10:59:43.084877 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbmz2\" (UniqueName: \"kubernetes.io/projected/d1708115-95b0-47fc-a0f3-0fbdb3786cd4-kube-api-access-lbmz2\") pod \"perses-operator-5bf474d74f-dcljb\" (UID: \"d1708115-95b0-47fc-a0f3-0fbdb3786cd4\") " pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:43.084957 master-0 kubenswrapper[17876]: I0313 10:59:43.084913 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-telemeter-client-tls\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.084957 master-0 kubenswrapper[17876]: I0313 10:59:43.084947 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fd0bffa-9548-4194-b3d7-1ee7787cd1ea-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-wk7kn\" (UID: \"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.085055 master-0 kubenswrapper[17876]: I0313 10:59:43.084972 17876 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1708115-95b0-47fc-a0f3-0fbdb3786cd4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dcljb\" (UID: \"d1708115-95b0-47fc-a0f3-0fbdb3786cd4\") " pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:43.085055 master-0 kubenswrapper[17876]: I0313 10:59:43.084993 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4nrh\" (UniqueName: \"kubernetes.io/projected/1a552e5f-3a19-4d3d-91ae-1b979910b87c-kube-api-access-m4nrh\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.085055 master-0 kubenswrapper[17876]: I0313 10:59:43.085013 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-config\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.086046 master-0 kubenswrapper[17876]: I0313 10:59:43.086004 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-trusted-ca-bundle\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.086669 master-0 kubenswrapper[17876]: I0313 10:59:43.086625 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hgcnt\" (UniqueName: \"kubernetes.io/projected/7553f594-9ac7-4a26-8df1-fe7dd957681b-kube-api-access-hgcnt\") pod \"must-gather-r2l86\" (UID: \"7553f594-9ac7-4a26-8df1-fe7dd957681b\") " pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:43.086669 master-0 kubenswrapper[17876]: 
I0313 10:59:43.086657 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-config\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.087394 master-0 kubenswrapper[17876]: I0313 10:59:43.087362 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/d1708115-95b0-47fc-a0f3-0fbdb3786cd4-openshift-service-ca\") pod \"perses-operator-5bf474d74f-dcljb\" (UID: \"d1708115-95b0-47fc-a0f3-0fbdb3786cd4\") " pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:43.087469 master-0 kubenswrapper[17876]: I0313 10:59:43.085449 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8gng\" (UniqueName: \"kubernetes.io/projected/fb2ed9c3-da42-4b9a-a79f-3869cf1076e9-kube-api-access-t8gng\") pod \"observability-operator-59bdc8b94-mndjh\" (UID: \"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9\") " pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:43.087606 master-0 kubenswrapper[17876]: I0313 10:59:43.087584 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-oauth-serving-cert\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.087756 master-0 kubenswrapper[17876]: I0313 10:59:43.087622 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-service-ca\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " 
pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.087805 master-0 kubenswrapper[17876]: I0313 10:59:43.087775 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-serving-certs-ca-bundle\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.088795 master-0 kubenswrapper[17876]: I0313 10:59:43.088739 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-serving-cert\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.089858 master-0 kubenswrapper[17876]: I0313 10:59:43.089830 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k9mwv\" (UniqueName: \"kubernetes.io/projected/fbb04691-5fab-4466-98d7-a66e345b9823-kube-api-access-k9mwv\") pod \"cert-manager-webhook-6888856db4-l8t7m\" (UID: \"fbb04691-5fab-4466-98d7-a66e345b9823\") " pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.089998 master-0 kubenswrapper[17876]: I0313 10:59:43.089978 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-federate-client-tls\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.090119 master-0 kubenswrapper[17876]: I0313 10:59:43.090088 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.090235 master-0 kubenswrapper[17876]: I0313 10:59:43.090222 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f761e3f1-1350-443e-95dc-10fde5d5ded6-must-gather-output\") pod \"must-gather-7szn2\" (UID: \"f761e3f1-1350-443e-95dc-10fde5d5ded6\") " pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.090343 master-0 kubenswrapper[17876]: I0313 10:59:43.090330 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-oauth-config\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.090477 master-0 kubenswrapper[17876]: I0313 10:59:43.090464 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbb04691-5fab-4466-98d7-a66e345b9823-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-l8t7m\" (UID: \"fbb04691-5fab-4466-98d7-a66e345b9823\") " pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.090619 master-0 kubenswrapper[17876]: I0313 10:59:43.090601 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb2ed9c3-da42-4b9a-a79f-3869cf1076e9-observability-operator-tls\") pod \"observability-operator-59bdc8b94-mndjh\" (UID: \"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9\") " 
pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:43.090745 master-0 kubenswrapper[17876]: I0313 10:59:43.090730 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b\" (UID: \"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:43.090876 master-0 kubenswrapper[17876]: I0313 10:59:43.090858 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tcssm\" (UniqueName: \"kubernetes.io/projected/5fd0bffa-9548-4194-b3d7-1ee7787cd1ea-kube-api-access-tcssm\") pod \"cert-manager-cainjector-5545bd876-wk7kn\" (UID: \"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.093041 master-0 kubenswrapper[17876]: I0313 10:59:43.093003 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b\" (UID: \"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:43.093880 master-0 kubenswrapper[17876]: I0313 10:59:43.093858 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-service-ca\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.098842 master-0 kubenswrapper[17876]: I0313 10:59:43.098493 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1a552e5f-3a19-4d3d-91ae-1b979910b87c-console-oauth-config\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.100189 master-0 kubenswrapper[17876]: I0313 10:59:43.099497 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1a552e5f-3a19-4d3d-91ae-1b979910b87c-oauth-serving-cert\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.101909 master-0 kubenswrapper[17876]: I0313 10:59:43.101862 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/fb2ed9c3-da42-4b9a-a79f-3869cf1076e9-observability-operator-tls\") pod \"observability-operator-59bdc8b94-mndjh\" (UID: \"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9\") " pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:43.107883 master-0 kubenswrapper[17876]: I0313 10:59:43.107853 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbmz2\" (UniqueName: \"kubernetes.io/projected/d1708115-95b0-47fc-a0f3-0fbdb3786cd4-kube-api-access-lbmz2\") pod \"perses-operator-5bf474d74f-dcljb\" (UID: \"d1708115-95b0-47fc-a0f3-0fbdb3786cd4\") " pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:43.109938 master-0 kubenswrapper[17876]: I0313 10:59:43.109898 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b\" (UID: \"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 
10:59:43.113892 master-0 kubenswrapper[17876]: I0313 10:59:43.113855 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8gng\" (UniqueName: \"kubernetes.io/projected/fb2ed9c3-da42-4b9a-a79f-3869cf1076e9-kube-api-access-t8gng\") pod \"observability-operator-59bdc8b94-mndjh\" (UID: \"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9\") " pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:43.138189 master-0 kubenswrapper[17876]: I0313 10:59:43.137901 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4nrh\" (UniqueName: \"kubernetes.io/projected/1a552e5f-3a19-4d3d-91ae-1b979910b87c-kube-api-access-m4nrh\") pod \"console-b8bb4856b-64pzc\" (UID: \"1a552e5f-3a19-4d3d-91ae-1b979910b87c\") " pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.149185 master-0 kubenswrapper[17876]: I0313 10:59:43.148832 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" Mar 13 10:59:43.149504 master-0 kubenswrapper[17876]: I0313 10:59:43.149204 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84"] Mar 13 10:59:43.150470 master-0 kubenswrapper[17876]: I0313 10:59:43.150403 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.165534 master-0 kubenswrapper[17876]: I0313 10:59:43.165231 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-cb4c85d9-27z99"] Mar 13 10:59:43.167001 master-0 kubenswrapper[17876]: I0313 10:59:43.166971 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.178924 master-0 kubenswrapper[17876]: I0313 10:59:43.178833 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx"] Mar 13 10:59:43.181162 master-0 kubenswrapper[17876]: I0313 10:59:43.179933 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx" Mar 13 10:59:43.181362 master-0 kubenswrapper[17876]: I0313 10:59:43.181319 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" Mar 13 10:59:43.186146 master-0 kubenswrapper[17876]: I0313 10:59:43.186104 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdm8d/must-gather-r2l86"] Mar 13 10:59:43.192655 master-0 kubenswrapper[17876]: I0313 10:59:43.192585 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b8bb4856b-64pzc"] Mar 13 10:59:43.193687 master-0 kubenswrapper[17876]: I0313 10:59:43.193647 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-secret-telemeter-client\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.193760 master-0 kubenswrapper[17876]: I0313 10:59:43.193708 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-metrics-client-ca\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.193760 master-0 
kubenswrapper[17876]: I0313 10:59:43.193729 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkbj4\" (UniqueName: \"kubernetes.io/projected/f761e3f1-1350-443e-95dc-10fde5d5ded6-kube-api-access-fkbj4\") pod \"must-gather-7szn2\" (UID: \"f761e3f1-1350-443e-95dc-10fde5d5ded6\") " pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.193760 master-0 kubenswrapper[17876]: I0313 10:59:43.193746 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ffcpd\" (UniqueName: \"kubernetes.io/projected/41f9cf46-25ab-4720-b133-7d569e65e625-kube-api-access-ffcpd\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193767 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193808 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69nbp\" (UniqueName: \"kubernetes.io/projected/eb9cb7c5-480a-4898-9814-fc6852f405d2-kube-api-access-69nbp\") pod \"cert-manager-545d4d4674-l5ddw\" (UID: \"eb9cb7c5-480a-4898-9814-fc6852f405d2\") " pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193825 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: 
\"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-telemeter-client-tls\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193845 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f5b8e49-4acd-461f-b80f-915a68cccd66-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84\" (UID: \"0f5b8e49-4acd-461f-b80f-915a68cccd66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193869 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fd0bffa-9548-4194-b3d7-1ee7787cd1ea-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-wk7kn\" (UID: \"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193891 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plfjr\" (UniqueName: \"kubernetes.io/projected/6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36-kube-api-access-plfjr\") pod \"multus-admission-controller-cb4c85d9-27z99\" (UID: \"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.193961 master-0 kubenswrapper[17876]: I0313 10:59:43.193938 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-serving-certs-ca-bundle\") pod \"telemeter-client-6644589945-tkl8r\" (UID: 
\"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.193966 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k9mwv\" (UniqueName: \"kubernetes.io/projected/fbb04691-5fab-4466-98d7-a66e345b9823-kube-api-access-k9mwv\") pod \"cert-manager-webhook-6888856db4-l8t7m\" (UID: \"fbb04691-5fab-4466-98d7-a66e345b9823\") " pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.193990 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f5b8e49-4acd-461f-b80f-915a68cccd66-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84\" (UID: \"0f5b8e49-4acd-461f-b80f-915a68cccd66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.194008 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-federate-client-tls\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.194028 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.194050 17876 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f761e3f1-1350-443e-95dc-10fde5d5ded6-must-gather-output\") pod \"must-gather-7szn2\" (UID: \"f761e3f1-1350-443e-95dc-10fde5d5ded6\") " pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.194083 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb9cb7c5-480a-4898-9814-fc6852f405d2-bound-sa-token\") pod \"cert-manager-545d4d4674-l5ddw\" (UID: \"eb9cb7c5-480a-4898-9814-fc6852f405d2\") " pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.194138 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36-webhook-certs\") pod \"multus-admission-controller-cb4c85d9-27z99\" (UID: \"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.194218 master-0 kubenswrapper[17876]: I0313 10:59:43.194173 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbb04691-5fab-4466-98d7-a66e345b9823-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-l8t7m\" (UID: \"fbb04691-5fab-4466-98d7-a66e345b9823\") " pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.194502 master-0 kubenswrapper[17876]: I0313 10:59:43.194232 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tcssm\" (UniqueName: \"kubernetes.io/projected/5fd0bffa-9548-4194-b3d7-1ee7787cd1ea-kube-api-access-tcssm\") pod \"cert-manager-cainjector-5545bd876-wk7kn\" (UID: 
\"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.195833 master-0 kubenswrapper[17876]: I0313 10:59:43.195803 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/f761e3f1-1350-443e-95dc-10fde5d5ded6-must-gather-output\") pod \"must-gather-7szn2\" (UID: \"f761e3f1-1350-443e-95dc-10fde5d5ded6\") " pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.195953 master-0 kubenswrapper[17876]: I0313 10:59:43.195862 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-serving-certs-ca-bundle\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.198357 master-0 kubenswrapper[17876]: I0313 10:59:43.197927 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-telemeter-trusted-ca-bundle\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.199492 master-0 kubenswrapper[17876]: I0313 10:59:43.199456 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/41f9cf46-25ab-4720-b133-7d569e65e625-metrics-client-ca\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.201191 master-0 kubenswrapper[17876]: I0313 10:59:43.201155 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: 
\"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-federate-client-tls\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.203714 master-0 kubenswrapper[17876]: I0313 10:59:43.203670 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl"] Mar 13 10:59:43.214994 master-0 kubenswrapper[17876]: I0313 10:59:43.213328 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-telemeter-client-tls\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.214994 master-0 kubenswrapper[17876]: I0313 10:59:43.213436 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b"] Mar 13 10:59:43.216140 master-0 kubenswrapper[17876]: I0313 10:59:43.215747 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-mndjh" Mar 13 10:59:43.219181 master-0 kubenswrapper[17876]: I0313 10:59:43.219143 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-secret-telemeter-client\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.221782 master-0 kubenswrapper[17876]: I0313 10:59:43.221741 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdm8d/must-gather-7szn2"] Mar 13 10:59:43.223692 master-0 kubenswrapper[17876]: I0313 10:59:43.222848 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/41f9cf46-25ab-4720-b133-7d569e65e625-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.227767 master-0 kubenswrapper[17876]: I0313 10:59:43.216336 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:43.230509 master-0 kubenswrapper[17876]: I0313 10:59:43.230303 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/fbb04691-5fab-4466-98d7-a66e345b9823-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-l8t7m\" (UID: \"fbb04691-5fab-4466-98d7-a66e345b9823\") " pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.230974 master-0 kubenswrapper[17876]: I0313 10:59:43.230931 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ffcpd\" (UniqueName: \"kubernetes.io/projected/41f9cf46-25ab-4720-b133-7d569e65e625-kube-api-access-ffcpd\") pod \"telemeter-client-6644589945-tkl8r\" (UID: \"41f9cf46-25ab-4720-b133-7d569e65e625\") " pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.231604 master-0 kubenswrapper[17876]: I0313 10:59:43.231344 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/5fd0bffa-9548-4194-b3d7-1ee7787cd1ea-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-wk7kn\" (UID: \"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.231604 master-0 kubenswrapper[17876]: I0313 10:59:43.231414 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k9mwv\" (UniqueName: \"kubernetes.io/projected/fbb04691-5fab-4466-98d7-a66e345b9823-kube-api-access-k9mwv\") pod \"cert-manager-webhook-6888856db4-l8t7m\" (UID: \"fbb04691-5fab-4466-98d7-a66e345b9823\") " pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.231770 master-0 kubenswrapper[17876]: I0313 10:59:43.231617 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tcssm\" (UniqueName: 
\"kubernetes.io/projected/5fd0bffa-9548-4194-b3d7-1ee7787cd1ea-kube-api-access-tcssm\") pod \"cert-manager-cainjector-5545bd876-wk7kn\" (UID: \"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea\") " pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.231995 master-0 kubenswrapper[17876]: I0313 10:59:43.231965 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkbj4\" (UniqueName: \"kubernetes.io/projected/f761e3f1-1350-443e-95dc-10fde5d5ded6-kube-api-access-fkbj4\") pod \"must-gather-7szn2\" (UID: \"f761e3f1-1350-443e-95dc-10fde5d5ded6\") " pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.233158 master-0 kubenswrapper[17876]: I0313 10:59:43.233113 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-b8bb4856b-64pzc" Mar 13 10:59:43.235354 master-0 kubenswrapper[17876]: I0313 10:59:43.235324 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-tkl8r"] Mar 13 10:59:43.241330 master-0 kubenswrapper[17876]: I0313 10:59:43.241283 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdm8d/must-gather-7szn2" Mar 13 10:59:43.265635 master-0 kubenswrapper[17876]: I0313 10:59:43.265569 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-dcljb" Mar 13 10:59:43.285990 master-0 kubenswrapper[17876]: I0313 10:59:43.285893 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-l5ddw"] Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.295767 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f5b8e49-4acd-461f-b80f-915a68cccd66-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84\" (UID: \"0f5b8e49-4acd-461f-b80f-915a68cccd66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.295857 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb9cb7c5-480a-4898-9814-fc6852f405d2-bound-sa-token\") pod \"cert-manager-545d4d4674-l5ddw\" (UID: \"eb9cb7c5-480a-4898-9814-fc6852f405d2\") " pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.295890 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36-webhook-certs\") pod \"multus-admission-controller-cb4c85d9-27z99\" (UID: \"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.295931 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc6np\" (UniqueName: \"kubernetes.io/projected/b2222235-3f55-4e45-9632-5a017084e976-kube-api-access-pc6np\") pod \"obo-prometheus-operator-68bc856cb9-wfbgx\" (UID: \"b2222235-3f55-4e45-9632-5a017084e976\") " 
pod="openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx" Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.296032 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69nbp\" (UniqueName: \"kubernetes.io/projected/eb9cb7c5-480a-4898-9814-fc6852f405d2-kube-api-access-69nbp\") pod \"cert-manager-545d4d4674-l5ddw\" (UID: \"eb9cb7c5-480a-4898-9814-fc6852f405d2\") " pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.296059 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f5b8e49-4acd-461f-b80f-915a68cccd66-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84\" (UID: \"0f5b8e49-4acd-461f-b80f-915a68cccd66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.296409 master-0 kubenswrapper[17876]: I0313 10:59:43.296081 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plfjr\" (UniqueName: \"kubernetes.io/projected/6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36-kube-api-access-plfjr\") pod \"multus-admission-controller-cb4c85d9-27z99\" (UID: \"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.299623 master-0 kubenswrapper[17876]: I0313 10:59:43.299550 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-cb4c85d9-27z99"] Mar 13 10:59:43.302850 master-0 kubenswrapper[17876]: I0313 10:59:43.302815 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36-webhook-certs\") pod \"multus-admission-controller-cb4c85d9-27z99\" (UID: \"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36\") " 
pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.304743 master-0 kubenswrapper[17876]: I0313 10:59:43.304597 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0f5b8e49-4acd-461f-b80f-915a68cccd66-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84\" (UID: \"0f5b8e49-4acd-461f-b80f-915a68cccd66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.304980 master-0 kubenswrapper[17876]: I0313 10:59:43.304947 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-wk7kn"] Mar 13 10:59:43.309017 master-0 kubenswrapper[17876]: I0313 10:59:43.308961 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fdm8d/must-gather-r2l86" Mar 13 10:59:43.315633 master-0 kubenswrapper[17876]: I0313 10:59:43.315582 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eb9cb7c5-480a-4898-9814-fc6852f405d2-bound-sa-token\") pod \"cert-manager-545d4d4674-l5ddw\" (UID: \"eb9cb7c5-480a-4898-9814-fc6852f405d2\") " pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.318599 master-0 kubenswrapper[17876]: I0313 10:59:43.318553 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0f5b8e49-4acd-461f-b80f-915a68cccd66-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84\" (UID: \"0f5b8e49-4acd-461f-b80f-915a68cccd66\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.322544 master-0 kubenswrapper[17876]: I0313 10:59:43.322496 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69nbp\" (UniqueName: 
\"kubernetes.io/projected/eb9cb7c5-480a-4898-9814-fc6852f405d2-kube-api-access-69nbp\") pod \"cert-manager-545d4d4674-l5ddw\" (UID: \"eb9cb7c5-480a-4898-9814-fc6852f405d2\") " pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.323070 master-0 kubenswrapper[17876]: I0313 10:59:43.323038 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plfjr\" (UniqueName: \"kubernetes.io/projected/6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36-kube-api-access-plfjr\") pod \"multus-admission-controller-cb4c85d9-27z99\" (UID: \"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.326706 master-0 kubenswrapper[17876]: I0313 10:59:43.326589 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-mndjh"] Mar 13 10:59:43.396854 master-0 kubenswrapper[17876]: I0313 10:59:43.396800 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc6np\" (UniqueName: \"kubernetes.io/projected/b2222235-3f55-4e45-9632-5a017084e976-kube-api-access-pc6np\") pod \"obo-prometheus-operator-68bc856cb9-wfbgx\" (UID: \"b2222235-3f55-4e45-9632-5a017084e976\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx" Mar 13 10:59:43.398590 master-0 kubenswrapper[17876]: I0313 10:59:43.398552 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dcljb"] Mar 13 10:59:43.419039 master-0 kubenswrapper[17876]: I0313 10:59:43.418991 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-89944f9bc-f5dt2"] Mar 13 10:59:43.420483 master-0 kubenswrapper[17876]: I0313 10:59:43.420443 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc6np\" (UniqueName: \"kubernetes.io/projected/b2222235-3f55-4e45-9632-5a017084e976-kube-api-access-pc6np\") pod 
\"obo-prometheus-operator-68bc856cb9-wfbgx\" (UID: \"b2222235-3f55-4e45-9632-5a017084e976\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx" Mar 13 10:59:43.433346 master-0 kubenswrapper[17876]: I0313 10:59:43.433291 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84"] Mar 13 10:59:43.442339 master-0 kubenswrapper[17876]: I0313 10:59:43.441619 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx"] Mar 13 10:59:43.472627 master-0 kubenswrapper[17876]: I0313 10:59:43.472293 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-l8t7m"] Mar 13 10:59:43.478741 master-0 kubenswrapper[17876]: I0313 10:59:43.472977 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" Mar 13 10:59:43.482208 master-0 kubenswrapper[17876]: I0313 10:59:43.481724 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" Mar 13 10:59:43.504609 master-0 kubenswrapper[17876]: I0313 10:59:43.504557 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" Mar 13 10:59:43.531149 master-0 kubenswrapper[17876]: I0313 10:59:43.531116 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx" Mar 13 10:59:43.571897 master-0 kubenswrapper[17876]: I0313 10:59:43.565743 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-l5ddw" Mar 13 10:59:43.607113 master-0 kubenswrapper[17876]: I0313 10:59:43.605918 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" Mar 13 10:59:43.613758 master-0 kubenswrapper[17876]: I0313 10:59:43.613707 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" Mar 13 10:59:43.889142 master-0 kubenswrapper[17876]: I0313 10:59:43.888687 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl"] Mar 13 10:59:43.942129 master-0 kubenswrapper[17876]: W0313 10:59:43.938865 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod244bdfab_08d8_4388_9791_b9fb3ea1a63e.slice/crio-8e17ea8a05eb6d30ee0966ae0369dc1d2869255b4567b3d2875fdb3ead857f7b WatchSource:0}: Error finding container 8e17ea8a05eb6d30ee0966ae0369dc1d2869255b4567b3d2875fdb3ead857f7b: Status 404 returned error can't find the container with id 8e17ea8a05eb6d30ee0966ae0369dc1d2869255b4567b3d2875fdb3ead857f7b Mar 13 10:59:43.943399 master-0 kubenswrapper[17876]: I0313 10:59:43.942814 17876 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 13 10:59:44.102136 master-0 kubenswrapper[17876]: I0313 10:59:44.101305 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b"] Mar 13 10:59:44.171189 master-0 kubenswrapper[17876]: W0313 10:59:44.170707 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0ca5f9ff_8b93_4e3c_bec4_5d2ef1aa68e4.slice/crio-68a7c8c72f84318f8d8fa1409177bfc18b7cb4380baaa00cae5aa6e2b3d29db4 WatchSource:0}: Error finding container 68a7c8c72f84318f8d8fa1409177bfc18b7cb4380baaa00cae5aa6e2b3d29db4: Status 404 returned error can't find the container with id 
68a7c8c72f84318f8d8fa1409177bfc18b7cb4380baaa00cae5aa6e2b3d29db4 Mar 13 10:59:44.198972 master-0 kubenswrapper[17876]: I0313 10:59:44.198929 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-mndjh"] Mar 13 10:59:44.208196 master-0 kubenswrapper[17876]: I0313 10:59:44.206635 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-89944f9bc-f5dt2"] Mar 13 10:59:44.286583 master-0 kubenswrapper[17876]: I0313 10:59:44.286514 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b8bb4856b-64pzc"] Mar 13 10:59:44.286782 master-0 kubenswrapper[17876]: W0313 10:59:44.286595 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1a552e5f_3a19_4d3d_91ae_1b979910b87c.slice/crio-6e2151ebe2c5d6d02e9f4dbfc210bb5d497757fa957502c81a8f59d08adc58e8 WatchSource:0}: Error finding container 6e2151ebe2c5d6d02e9f4dbfc210bb5d497757fa957502c81a8f59d08adc58e8: Status 404 returned error can't find the container with id 6e2151ebe2c5d6d02e9f4dbfc210bb5d497757fa957502c81a8f59d08adc58e8 Mar 13 10:59:44.303706 master-0 kubenswrapper[17876]: I0313 10:59:44.303538 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdm8d/must-gather-r2l86"] Mar 13 10:59:44.311063 master-0 kubenswrapper[17876]: I0313 10:59:44.311014 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-dcljb"] Mar 13 10:59:44.324355 master-0 kubenswrapper[17876]: W0313 10:59:44.323999 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1708115_95b0_47fc_a0f3_0fbdb3786cd4.slice/crio-5ed491b0ed896193f9f7c8d6746c7d4da626da5fb00a81d041d331c9d97e6232 WatchSource:0}: Error finding container 5ed491b0ed896193f9f7c8d6746c7d4da626da5fb00a81d041d331c9d97e6232: 
Status 404 returned error can't find the container with id 5ed491b0ed896193f9f7c8d6746c7d4da626da5fb00a81d041d331c9d97e6232 Mar 13 10:59:44.445417 master-0 kubenswrapper[17876]: W0313 10:59:44.445360 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5fd0bffa_9548_4194_b3d7_1ee7787cd1ea.slice/crio-77eb247c503be51ed1d423730a26532f7d0436baf6c58682d293df2eccc63c81 WatchSource:0}: Error finding container 77eb247c503be51ed1d423730a26532f7d0436baf6c58682d293df2eccc63c81: Status 404 returned error can't find the container with id 77eb247c503be51ed1d423730a26532f7d0436baf6c58682d293df2eccc63c81 Mar 13 10:59:44.452348 master-0 kubenswrapper[17876]: I0313 10:59:44.448045 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-wk7kn"] Mar 13 10:59:44.452348 master-0 kubenswrapper[17876]: W0313 10:59:44.448769 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfbb04691_5fab_4466_98d7_a66e345b9823.slice/crio-27c5c5921f8c0ef5e6541c6d2f18512db160c156eef98be4f98fde6bff94e851 WatchSource:0}: Error finding container 27c5c5921f8c0ef5e6541c6d2f18512db160c156eef98be4f98fde6bff94e851: Status 404 returned error can't find the container with id 27c5c5921f8c0ef5e6541c6d2f18512db160c156eef98be4f98fde6bff94e851 Mar 13 10:59:44.456442 master-0 kubenswrapper[17876]: I0313 10:59:44.456052 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-l8t7m"] Mar 13 10:59:44.462869 master-0 kubenswrapper[17876]: I0313 10:59:44.462773 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx"] Mar 13 10:59:44.463216 master-0 kubenswrapper[17876]: W0313 10:59:44.463169 17876 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2222235_3f55_4e45_9632_5a017084e976.slice/crio-140c4273f90b9b1e279dfb3391468e7b1c4baf32185c7ad3f06b1d07fb43cbb0 WatchSource:0}: Error finding container 140c4273f90b9b1e279dfb3391468e7b1c4baf32185c7ad3f06b1d07fb43cbb0: Status 404 returned error can't find the container with id 140c4273f90b9b1e279dfb3391468e7b1c4baf32185c7ad3f06b1d07fb43cbb0 Mar 13 10:59:44.478684 master-0 kubenswrapper[17876]: W0313 10:59:44.478627 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf761e3f1_1350_443e_95dc_10fde5d5ded6.slice/crio-c810e9112264baabece1c85d7e8a6663d9508a68a7c556174c719b2771576f99 WatchSource:0}: Error finding container c810e9112264baabece1c85d7e8a6663d9508a68a7c556174c719b2771576f99: Status 404 returned error can't find the container with id c810e9112264baabece1c85d7e8a6663d9508a68a7c556174c719b2771576f99 Mar 13 10:59:44.485347 master-0 kubenswrapper[17876]: I0313 10:59:44.485300 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fdm8d/must-gather-7szn2"] Mar 13 10:59:44.636894 master-0 kubenswrapper[17876]: I0313 10:59:44.636812 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-cb4c85d9-27z99"] Mar 13 10:59:44.655176 master-0 kubenswrapper[17876]: W0313 10:59:44.655125 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podeb9cb7c5_480a_4898_9814_fc6852f405d2.slice/crio-53de1cb4c1b20a7b9b972f7df4e24fd9cb811a85ce0c4297297dbe349f7ce9fc WatchSource:0}: Error finding container 53de1cb4c1b20a7b9b972f7df4e24fd9cb811a85ce0c4297297dbe349f7ce9fc: Status 404 returned error can't find the container with id 53de1cb4c1b20a7b9b972f7df4e24fd9cb811a85ce0c4297297dbe349f7ce9fc Mar 13 10:59:44.673234 master-0 kubenswrapper[17876]: I0313 10:59:44.673183 17876 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-l5ddw"] Mar 13 10:59:44.678437 master-0 kubenswrapper[17876]: W0313 10:59:44.678386 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f5b8e49_4acd_461f_b80f_915a68cccd66.slice/crio-d2969cce66e44d3dc5662b5d663e7d0bdcec25e83a2d7a024dd0b9dd0de129e4 WatchSource:0}: Error finding container d2969cce66e44d3dc5662b5d663e7d0bdcec25e83a2d7a024dd0b9dd0de129e4: Status 404 returned error can't find the container with id d2969cce66e44d3dc5662b5d663e7d0bdcec25e83a2d7a024dd0b9dd0de129e4 Mar 13 10:59:44.680319 master-0 kubenswrapper[17876]: W0313 10:59:44.680264 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f9cf46_25ab_4720_b133_7d569e65e625.slice/crio-a6a9ccd22bec0eea8c86bd2c483d8b81aae6be86576fada7690d0272c9dd8da5 WatchSource:0}: Error finding container a6a9ccd22bec0eea8c86bd2c483d8b81aae6be86576fada7690d0272c9dd8da5: Status 404 returned error can't find the container with id a6a9ccd22bec0eea8c86bd2c483d8b81aae6be86576fada7690d0272c9dd8da5 Mar 13 10:59:44.683552 master-0 kubenswrapper[17876]: I0313 10:59:44.683233 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84"] Mar 13 10:59:44.697114 master-0 kubenswrapper[17876]: I0313 10:59:44.696085 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-6644589945-tkl8r"] Mar 13 10:59:44.735180 master-0 kubenswrapper[17876]: I0313 10:59:44.734334 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-l8t7m" event={"ID":"fbb04691-5fab-4466-98d7-a66e345b9823","Type":"ContainerStarted","Data":"27c5c5921f8c0ef5e6541c6d2f18512db160c156eef98be4f98fde6bff94e851"} Mar 13 10:59:44.737866 master-0 
kubenswrapper[17876]: I0313 10:59:44.737780 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-mndjh" event={"ID":"fb2ed9c3-da42-4b9a-a79f-3869cf1076e9","Type":"ContainerStarted","Data":"04f0c4f89aa8fce9d4545976674d0160d2b00793152b2275396f27c4ca0d4f74"} Mar 13 10:59:44.741251 master-0 kubenswrapper[17876]: I0313 10:59:44.741207 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" event={"ID":"b5fb8df2-3f81-41d8-b588-7966f8e499f0","Type":"ContainerStarted","Data":"c0b1ec7bb3e98691d6bd6eb0f840b7333c526f325a58c77beb1427069afc3d06"} Mar 13 10:59:44.741251 master-0 kubenswrapper[17876]: I0313 10:59:44.741254 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" event={"ID":"b5fb8df2-3f81-41d8-b588-7966f8e499f0","Type":"ContainerStarted","Data":"317c74aff42a7ca369276e247cb0b102194333de5a495ff757e78bd8d3256a92"} Mar 13 10:59:44.743326 master-0 kubenswrapper[17876]: I0313 10:59:44.743280 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:44.748325 master-0 kubenswrapper[17876]: I0313 10:59:44.746667 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" event={"ID":"41f9cf46-25ab-4720-b133-7d569e65e625","Type":"ContainerStarted","Data":"a6a9ccd22bec0eea8c86bd2c483d8b81aae6be86576fada7690d0272c9dd8da5"} Mar 13 10:59:44.748792 master-0 kubenswrapper[17876]: I0313 10:59:44.748743 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdm8d/must-gather-7szn2" event={"ID":"f761e3f1-1350-443e-95dc-10fde5d5ded6","Type":"ContainerStarted","Data":"c810e9112264baabece1c85d7e8a6663d9508a68a7c556174c719b2771576f99"} Mar 13 10:59:44.751688 master-0 kubenswrapper[17876]: I0313 10:59:44.751650 17876 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-jgj4b" event={"ID":"0ca5f9ff-8b93-4e3c-bec4-5d2ef1aa68e4","Type":"ContainerStarted","Data":"68a7c8c72f84318f8d8fa1409177bfc18b7cb4380baaa00cae5aa6e2b3d29db4"} Mar 13 10:59:44.753617 master-0 kubenswrapper[17876]: I0313 10:59:44.753585 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-54dcbb7cdc-7ps84" event={"ID":"0f5b8e49-4acd-461f-b80f-915a68cccd66","Type":"ContainerStarted","Data":"d2969cce66e44d3dc5662b5d663e7d0bdcec25e83a2d7a024dd0b9dd0de129e4"} Mar 13 10:59:44.758420 master-0 kubenswrapper[17876]: I0313 10:59:44.758389 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-9c6b5c975-mh7gl" event={"ID":"244bdfab-08d8-4388-9791-b9fb3ea1a63e","Type":"ContainerStarted","Data":"8e17ea8a05eb6d30ee0966ae0369dc1d2869255b4567b3d2875fdb3ead857f7b"} Mar 13 10:59:44.760604 master-0 kubenswrapper[17876]: I0313 10:59:44.760528 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b8bb4856b-64pzc" event={"ID":"1a552e5f-3a19-4d3d-91ae-1b979910b87c","Type":"ContainerStarted","Data":"bea81d9d3a99076fc3a5f5fa7f102b7699a3d001c10a10c17d61f286793dd453"} Mar 13 10:59:44.760686 master-0 kubenswrapper[17876]: I0313 10:59:44.760610 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b8bb4856b-64pzc" event={"ID":"1a552e5f-3a19-4d3d-91ae-1b979910b87c","Type":"ContainerStarted","Data":"6e2151ebe2c5d6d02e9f4dbfc210bb5d497757fa957502c81a8f59d08adc58e8"} Mar 13 10:59:44.770077 master-0 kubenswrapper[17876]: I0313 10:59:44.769991 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" podStartSLOduration=195.769968627 podStartE2EDuration="3m15.769968627s" podCreationTimestamp="2026-03-13 10:56:29 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:59:44.765784399 +0000 UTC m=+1092.601590875" watchObservedRunningTime="2026-03-13 10:59:44.769968627 +0000 UTC m=+1092.605775103" Mar 13 10:59:44.770404 master-0 kubenswrapper[17876]: I0313 10:59:44.770352 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" event={"ID":"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36","Type":"ContainerStarted","Data":"433d82553a9b3af9092d57207783c9b120635371c0d06871bee3b1d2819d1ef0"} Mar 13 10:59:44.780272 master-0 kubenswrapper[17876]: I0313 10:59:44.779805 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-wfbgx" event={"ID":"b2222235-3f55-4e45-9632-5a017084e976","Type":"ContainerStarted","Data":"140c4273f90b9b1e279dfb3391468e7b1c4baf32185c7ad3f06b1d07fb43cbb0"} Mar 13 10:59:44.784164 master-0 kubenswrapper[17876]: I0313 10:59:44.784047 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-l5ddw" event={"ID":"eb9cb7c5-480a-4898-9814-fc6852f405d2","Type":"ContainerStarted","Data":"53de1cb4c1b20a7b9b972f7df4e24fd9cb811a85ce0c4297297dbe349f7ce9fc"} Mar 13 10:59:44.788820 master-0 kubenswrapper[17876]: I0313 10:59:44.788747 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-wk7kn" event={"ID":"5fd0bffa-9548-4194-b3d7-1ee7787cd1ea","Type":"ContainerStarted","Data":"77eb247c503be51ed1d423730a26532f7d0436baf6c58682d293df2eccc63c81"} Mar 13 10:59:44.794779 master-0 kubenswrapper[17876]: I0313 10:59:44.794701 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdm8d/must-gather-r2l86" event={"ID":"7553f594-9ac7-4a26-8df1-fe7dd957681b","Type":"ContainerStarted","Data":"81dc3a4cf1683886cc28bde4e176e1d3d48e21218e089ada9c234f91beb29263"} Mar 13 
10:59:44.795031 master-0 kubenswrapper[17876]: I0313 10:59:44.794871 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b8bb4856b-64pzc" podStartSLOduration=177.79485622 podStartE2EDuration="2m57.79485622s" podCreationTimestamp="2026-03-13 10:56:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:59:44.785941008 +0000 UTC m=+1092.621747494" watchObservedRunningTime="2026-03-13 10:59:44.79485622 +0000 UTC m=+1092.630662716" Mar 13 10:59:44.810043 master-0 kubenswrapper[17876]: I0313 10:59:44.809986 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-dcljb" event={"ID":"d1708115-95b0-47fc-a0f3-0fbdb3786cd4","Type":"ContainerStarted","Data":"5ed491b0ed896193f9f7c8d6746c7d4da626da5fb00a81d041d331c9d97e6232"} Mar 13 10:59:45.485110 master-0 kubenswrapper[17876]: I0313 10:59:45.482842 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-89944f9bc-f5dt2" Mar 13 10:59:45.868598 master-0 kubenswrapper[17876]: I0313 10:59:45.868461 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" event={"ID":"41f9cf46-25ab-4720-b133-7d569e65e625","Type":"ContainerStarted","Data":"fb30da5c2f45359610716730184c8084b316186d6ebfb032a3874308b4992055"} Mar 13 10:59:45.868598 master-0 kubenswrapper[17876]: I0313 10:59:45.868526 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" event={"ID":"41f9cf46-25ab-4720-b133-7d569e65e625","Type":"ContainerStarted","Data":"934611903adbda5277a0ea54fed07eda017b173265be644c6037537597c984bf"} Mar 13 10:59:45.868598 master-0 kubenswrapper[17876]: I0313 10:59:45.868537 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" event={"ID":"41f9cf46-25ab-4720-b133-7d569e65e625","Type":"ContainerStarted","Data":"d5486e72c436a9430afce0476be0adf2afed4883ab0a40f12ab51c2545d9606d"} Mar 13 10:59:45.874113 master-0 kubenswrapper[17876]: I0313 10:59:45.874046 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" event={"ID":"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36","Type":"ContainerStarted","Data":"b0e1cb3f33f65a2c550ace14487651ca9234630dc8b17769852544433a4b5f47"} Mar 13 10:59:45.874249 master-0 kubenswrapper[17876]: I0313 10:59:45.874129 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" event={"ID":"6f1d903f-a47a-4c8c-aa6f-61fb1ee58b36","Type":"ContainerStarted","Data":"00ed47cc2b690fb962e74e10624cac11028d5598fe4d2ed43e7b3a92cb38d84e"} Mar 13 10:59:45.916293 master-0 kubenswrapper[17876]: I0313 10:59:45.916163 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-6644589945-tkl8r" podStartSLOduration=154.916144441 podStartE2EDuration="2m34.916144441s" podCreationTimestamp="2026-03-13 10:57:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:59:45.904809241 +0000 UTC m=+1093.740615717" watchObservedRunningTime="2026-03-13 10:59:45.916144441 +0000 UTC m=+1093.751950917" Mar 13 10:59:45.967332 master-0 kubenswrapper[17876]: I0313 10:59:45.962494 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-cb4c85d9-27z99" podStartSLOduration=171.96247222 podStartE2EDuration="2m51.96247222s" podCreationTimestamp="2026-03-13 10:56:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 
10:59:45.942490985 +0000 UTC m=+1093.778297471" watchObservedRunningTime="2026-03-13 10:59:45.96247222 +0000 UTC m=+1093.798278696" Mar 13 10:59:46.030677 master-0 kubenswrapper[17876]: I0313 10:59:46.025797 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-7769569c45-6lqz5"] Mar 13 10:59:46.030677 master-0 kubenswrapper[17876]: I0313 10:59:46.026149 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" podUID="9ca1b7c7-41af-46e9-8f5d-a476ee2b7587" containerName="multus-admission-controller" containerID="cri-o://2374456736ebc7d72463b6654e06d916657c29a267fba9a956c950f521d8de03" gracePeriod=30 Mar 13 10:59:46.030677 master-0 kubenswrapper[17876]: I0313 10:59:46.026626 17876 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" podUID="9ca1b7c7-41af-46e9-8f5d-a476ee2b7587" containerName="kube-rbac-proxy" containerID="cri-o://8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f" gracePeriod=30 Mar 13 10:59:46.217173 master-0 kubenswrapper[17876]: I0313 10:59:46.215933 17876 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-b8bb4856b-64pzc"] Mar 13 10:59:46.520426 master-0 kubenswrapper[17876]: I0313 10:59:46.520301 17876 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-76d5dcff75-c6xfz"] Mar 13 10:59:46.521962 master-0 kubenswrapper[17876]: I0313 10:59:46.521924 17876 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.601058 master-0 kubenswrapper[17876]: I0313 10:59:46.600924 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76d5dcff75-c6xfz"] Mar 13 10:59:46.692857 master-0 kubenswrapper[17876]: I0313 10:59:46.692760 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-service-ca\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.693118 master-0 kubenswrapper[17876]: I0313 10:59:46.692874 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59j25\" (UniqueName: \"kubernetes.io/projected/eb610ac4-629c-4c56-8c94-a460c5146685-kube-api-access-59j25\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.693118 master-0 kubenswrapper[17876]: I0313 10:59:46.692928 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-console-config\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.693118 master-0 kubenswrapper[17876]: I0313 10:59:46.693002 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb610ac4-629c-4c56-8c94-a460c5146685-console-oauth-config\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.694452 master-0 
kubenswrapper[17876]: I0313 10:59:46.694396 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-oauth-serving-cert\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.694591 master-0 kubenswrapper[17876]: I0313 10:59:46.694555 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-trusted-ca-bundle\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.694653 master-0 kubenswrapper[17876]: I0313 10:59:46.694640 17876 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb610ac4-629c-4c56-8c94-a460c5146685-console-serving-cert\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.796176 master-0 kubenswrapper[17876]: I0313 10:59:46.796042 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59j25\" (UniqueName: \"kubernetes.io/projected/eb610ac4-629c-4c56-8c94-a460c5146685-kube-api-access-59j25\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz" Mar 13 10:59:46.796431 master-0 kubenswrapper[17876]: I0313 10:59:46.796195 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-console-config\") pod \"console-76d5dcff75-c6xfz\" (UID: 
\"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.796431 master-0 kubenswrapper[17876]: I0313 10:59:46.796232 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb610ac4-629c-4c56-8c94-a460c5146685-console-oauth-config\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.796431 master-0 kubenswrapper[17876]: I0313 10:59:46.796295 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-oauth-serving-cert\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.796431 master-0 kubenswrapper[17876]: I0313 10:59:46.796417 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-trusted-ca-bundle\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.796655 master-0 kubenswrapper[17876]: I0313 10:59:46.796462 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb610ac4-629c-4c56-8c94-a460c5146685-console-serving-cert\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.796689 master-0 kubenswrapper[17876]: I0313 10:59:46.796652 17876 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-service-ca\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.797828 master-0 kubenswrapper[17876]: I0313 10:59:46.797791 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-console-config\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.798768 master-0 kubenswrapper[17876]: I0313 10:59:46.798732 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-service-ca\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.799440 master-0 kubenswrapper[17876]: I0313 10:59:46.799333 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-trusted-ca-bundle\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.806748 master-0 kubenswrapper[17876]: I0313 10:59:46.806693 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb610ac4-629c-4c56-8c94-a460c5146685-oauth-serving-cert\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.816201 master-0 kubenswrapper[17876]: I0313 10:59:46.816149 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59j25\" (UniqueName: \"kubernetes.io/projected/eb610ac4-629c-4c56-8c94-a460c5146685-kube-api-access-59j25\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.817622 master-0 kubenswrapper[17876]: I0313 10:59:46.817587 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb610ac4-629c-4c56-8c94-a460c5146685-console-oauth-config\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.845047 master-0 kubenswrapper[17876]: I0313 10:59:46.844997 17876 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb610ac4-629c-4c56-8c94-a460c5146685-console-serving-cert\") pod \"console-76d5dcff75-c6xfz\" (UID: \"eb610ac4-629c-4c56-8c94-a460c5146685\") " pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:46.905609 master-0 kubenswrapper[17876]: I0313 10:59:46.905557 17876 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76d5dcff75-c6xfz"
Mar 13 10:59:47.710469 master-0 kubenswrapper[17876]: E0313 10:59:47.710400 17876 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca1b7c7_41af_46e9_8f5d_a476ee2b7587.slice/crio-8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9ca1b7c7_41af_46e9_8f5d_a476ee2b7587.slice/crio-conmon-8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f.scope\": RecentStats: unable to find data in memory cache]"
Mar 13 10:59:47.717423 master-0 kubenswrapper[17876]: W0313 10:59:47.717374 17876 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb610ac4_629c_4c56_8c94_a460c5146685.slice/crio-90c903f8b6a3e54116c2c888bddf74c850f26b029aa4371cf9b697045d8a6ee4 WatchSource:0}: Error finding container 90c903f8b6a3e54116c2c888bddf74c850f26b029aa4371cf9b697045d8a6ee4: Status 404 returned error can't find the container with id 90c903f8b6a3e54116c2c888bddf74c850f26b029aa4371cf9b697045d8a6ee4
Mar 13 10:59:47.718786 master-0 kubenswrapper[17876]: I0313 10:59:47.718764 17876 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76d5dcff75-c6xfz"]
Mar 13 10:59:47.934518 master-0 kubenswrapper[17876]: I0313 10:59:47.934388 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5dcff75-c6xfz" event={"ID":"eb610ac4-629c-4c56-8c94-a460c5146685","Type":"ContainerStarted","Data":"90c903f8b6a3e54116c2c888bddf74c850f26b029aa4371cf9b697045d8a6ee4"}
Mar 13 10:59:47.938677 master-0 kubenswrapper[17876]: I0313 10:59:47.938596 17876 generic.go:334] "Generic (PLEG): container finished" podID="9ca1b7c7-41af-46e9-8f5d-a476ee2b7587" containerID="8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f" exitCode=0
Mar 13 10:59:47.938794 master-0 kubenswrapper[17876]: I0313 10:59:47.938688 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-7769569c45-6lqz5" event={"ID":"9ca1b7c7-41af-46e9-8f5d-a476ee2b7587","Type":"ContainerDied","Data":"8b943a42acd2c58a9a47a182ee54d2986f5b6361dae18ac83e4c2c1569753d0f"}
Mar 13 10:59:47.948076 master-0 kubenswrapper[17876]: I0313 10:59:47.947987 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdm8d/must-gather-r2l86" event={"ID":"7553f594-9ac7-4a26-8df1-fe7dd957681b","Type":"ContainerStarted","Data":"6f340227dae04e80ba74140920e8920d446c3ef47afdb73d5408f34deb69cb0a"}
Mar 13 10:59:48.964522 master-0 kubenswrapper[17876]: I0313 10:59:48.964463 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fdm8d/must-gather-r2l86" event={"ID":"7553f594-9ac7-4a26-8df1-fe7dd957681b","Type":"ContainerStarted","Data":"135ceeeae17cc3c43fa153b5fb2b025a675864420504b7cfbaf730bfe7f1c29b"}
Mar 13 10:59:48.976588 master-0 kubenswrapper[17876]: I0313 10:59:48.971836 17876 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76d5dcff75-c6xfz" event={"ID":"eb610ac4-629c-4c56-8c94-a460c5146685","Type":"ContainerStarted","Data":"8b126efc0112d55ef02504300970431ba9dd57ffaa1654463932ea74096ed383"}
Mar 13 10:59:48.987355 master-0 kubenswrapper[17876]: I0313 10:59:48.987237 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fdm8d/must-gather-r2l86" podStartSLOduration=180.031101188 podStartE2EDuration="3m2.987196796s" podCreationTimestamp="2026-03-13 10:56:46 +0000 UTC" firstStartedPulling="2026-03-13 10:59:44.332738897 +0000 UTC m=+1092.168545373" lastFinishedPulling="2026-03-13 10:59:47.288834505 +0000 UTC m=+1095.124640981" observedRunningTime="2026-03-13 10:59:48.985964281 +0000 UTC m=+1096.821770757" watchObservedRunningTime="2026-03-13 10:59:48.987196796 +0000 UTC m=+1096.823003272"
Mar 13 10:59:49.039967 master-0 kubenswrapper[17876]: I0313 10:59:49.039889 17876 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76d5dcff75-c6xfz" podStartSLOduration=3.039456412 podStartE2EDuration="3.039456412s" podCreationTimestamp="2026-03-13 10:59:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-13 10:59:49.013817908 +0000 UTC m=+1096.849624384" watchObservedRunningTime="2026-03-13 10:59:49.039456412 +0000 UTC m=+1096.875262888"
Mar 13 10:59:51.909709 master-0 kubenswrapper[17876]: I0313 10:59:51.909640 17876 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-zg9h2_b5ed7aff-47c0-42f3-9a26-9385d2bde582/cluster-version-operator/0.log"
Mar 13 10:59:53.236238 master-0 kubenswrapper[17876]: I0313 10:59:53.236049 17876 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-b8bb4856b-64pzc"